Restructured rules to match architecture (#555)

* Restructured rules to match architecture

* Added exports of all symbols in the deprecated location for legacy support

* Updated examples
Author: UebelAndre, 2021-03-12 08:54:14 -08:00 (committed by GitHub)
Parent: 8d522c3e4b
Commit: edbfa3bfa9
GPG Key ID: 4AEE18F83AFDEB23 (no known key found for this signature in database)
111 changed files with 4694 additions and 4294 deletions


@ -17,6 +17,5 @@ bzl_library(
"//for_workspace:bzl_srcs",
"//foreign_cc:bzl_srcs",
"//toolchains:bzl_srcs",
"//tools/build_defs:bzl_srcs",
],
)


@ -139,7 +139,7 @@ http_archive(
And in the `BUILD.bazel` file, put:
```python
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
cmake(
name = "pcre",
```


@ -3,11 +3,9 @@
- [boost_build](#boost_build)
- [cmake](#cmake)
- [cmake_tool](#cmake_tool)
- [ConfigureParameters](#ConfigureParameters)
- [configure_make](#configure_make)
- [ForeignCcArtifact](#ForeignCcArtifact)
- [ForeignCcDeps](#ForeignCcDeps)
- [InputFiles](#InputFiles)
- [make](#make)
- [make_tool](#make_tool)
- [native_tool_toolchain](#native_tool_toolchain)
@ -15,7 +13,6 @@
- [ninja_tool](#ninja_tool)
- [rules_foreign_cc_dependencies](#rules_foreign_cc_dependencies)
- [ToolInfo](#ToolInfo)
- [WrappedOutputs](#WrappedOutputs)
<a id="#boost_build"></a>
@ -366,28 +363,6 @@ Rule for building Ninja. Invokes configure script and make install.
| <a id="ninja_tool-ninja_srcs"></a>ninja_srcs | - | <a href="https://bazel.build/docs/build-ref.html#labels">Label</a> | required | |
<a id="#ConfigureParameters"></a>
## ConfigureParameters
<pre>
ConfigureParameters(<a href="#ConfigureParameters-ctx">ctx</a>, <a href="#ConfigureParameters-attrs">attrs</a>, <a href="#ConfigureParameters-inputs">inputs</a>)
</pre>
Parameters of the create_configure_script callback function, called by the
cc_external_rule_impl function. create_configure_script creates the configuration part
of the script and allows reusing the inputs structure created by the framework.
**FIELDS**
| Name | Description |
| :------------- | :------------- |
| <a id="ConfigureParameters-ctx"></a>ctx | Rule context |
| <a id="ConfigureParameters-attrs"></a>attrs | Attributes struct, created by create_attrs function above |
| <a id="ConfigureParameters-inputs"></a>inputs | InputFiles provider: summarized information on rule inputs, created by framework function, to be reused in script creator. Contains in particular merged compilation and linking dependencies. |
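As a rough illustration of the callback contract described above, here is a plain-Python sketch (namedtuples stand in for the Starlark structs; the `bootstrap_options` attribute is borrowed from the boost_build rule elsewhere in this commit):

```python
from collections import namedtuple

# Plain-Python stand-ins for the Starlark structs (illustrative only).
ConfigureParameters = namedtuple("ConfigureParameters", ["ctx", "attrs", "inputs"])
Ctx = namedtuple("Ctx", ["attr"])
Attr = namedtuple("Attr", ["lib_source", "bootstrap_options"])

def create_configure_script(params):
    # Returns only the configuration fragment of the generated shell script;
    # cc_external_rule_impl wraps it with the build and install steps.
    root = params.ctx.attr.lib_source  # a real callback derives this via detect_root()
    return "\n".join([
        "cd $INSTALLDIR",
        "./bootstrap.sh {}".format(" ".join(params.ctx.attr.bootstrap_options)),
    ])

params = ConfigureParameters(
    ctx = Ctx(attr = Attr(lib_source = "external/boost", bootstrap_options = ["--with-python"])),
    attrs = None,
    inputs = None,
)
script = create_configure_script(params)
```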
<a id="#ForeignCcArtifact"></a>
## ForeignCcArtifact
@ -433,32 +408,6 @@ Provider to pass transitive information about external libraries.
| <a id="ForeignCcDeps-artifacts"></a>artifacts | Depset of ForeignCcArtifact |
<a id="#InputFiles"></a>
## InputFiles
<pre>
InputFiles(<a href="#InputFiles-headers">headers</a>, <a href="#InputFiles-include_dirs">include_dirs</a>, <a href="#InputFiles-libs">libs</a>, <a href="#InputFiles-tools_files">tools_files</a>, <a href="#InputFiles-ext_build_dirs">ext_build_dirs</a>, <a href="#InputFiles-deps_compilation_info">deps_compilation_info</a>,
<a href="#InputFiles-deps_linking_info">deps_linking_info</a>, <a href="#InputFiles-declared_inputs">declared_inputs</a>)
</pre>
Provider to keep different kinds of input files, directories, and C/C++ compilation and linking info from dependencies
**FIELDS**
| Name | Description |
| :------------- | :------------- |
| <a id="InputFiles-headers"></a>headers | Include files built by Bazel. Will be copied into $EXT_BUILD_DEPS/include. |
| <a id="InputFiles-include_dirs"></a>include_dirs | Include directories built by Bazel. Will be copied into $EXT_BUILD_DEPS/include. |
| <a id="InputFiles-libs"></a>libs | Library files built by Bazel. Will be copied into $EXT_BUILD_DEPS/lib. |
| <a id="InputFiles-tools_files"></a>tools_files | Files and directories with tools needed for configuration/building to be copied into the bin folder, which is added to the PATH |
| <a id="InputFiles-ext_build_dirs"></a>ext_build_dirs | Directories with libraries, built by the framework function. These directories should be copied into $EXT_BUILD_DEPS/lib-name as is, with all contents. |
| <a id="InputFiles-deps_compilation_info"></a>deps_compilation_info | Merged CcCompilationInfo from deps attribute |
| <a id="InputFiles-deps_linking_info"></a>deps_linking_info | Merged CcLinkingInfo from deps attribute |
| <a id="InputFiles-declared_inputs"></a>declared_inputs | All files and directories that must be declared as action inputs |
<a id="#ToolInfo"></a>
## ToolInfo
@ -478,27 +427,6 @@ Information about the native tool
| <a id="ToolInfo-target"></a>target | If the tool is preinstalled, must be None. If the tool is built as part of the build, the corresponding build target, which should produce the tree artifact with the binary to call. |
<a id="#WrappedOutputs"></a>
## WrappedOutputs
<pre>
WrappedOutputs(<a href="#WrappedOutputs-log_file">log_file</a>, <a href="#WrappedOutputs-script_file">script_file</a>, <a href="#WrappedOutputs-wrapper_script">wrapper_script</a>, <a href="#WrappedOutputs-wrapper_script_file">wrapper_script_file</a>)
</pre>
Structure for passing the log and scripts file information, and wrapper script text.
**FIELDS**
| Name | Description |
| :------------- | :------------- |
| <a id="WrappedOutputs-log_file"></a>log_file | Execution log file |
| <a id="WrappedOutputs-script_file"></a>script_file | Main script file |
| <a id="WrappedOutputs-wrapper_script"></a>wrapper_script | Wrapper script text to execute |
| <a id="WrappedOutputs-wrapper_script_file"></a>wrapper_script_file | Wrapper script file (output for debugging purposes) |
<a id="#rules_foreign_cc_dependencies"></a>
## rules_foreign_cc_dependencies
@ -524,7 +452,7 @@ Call this function from the WORKSPACE file to initialize rules_foreign_cc de
| <a id="rules_foreign_cc_dependencies-ninja_version"></a>ninja_version | The target version of the ninja toolchain if <code>register_default_tools</code> or <code>register_built_tools</code> is set to <code>True</code>. | <code>"1.10.2"</code> |
| <a id="rules_foreign_cc_dependencies-register_preinstalled_tools"></a>register_preinstalled_tools | If true, toolchains will be registered for the native built tools installed on the exec host | <code>True</code> |
| <a id="rules_foreign_cc_dependencies-register_built_tools"></a>register_built_tools | If true, toolchains that build the tools from source are registered | <code>True</code> |
| <a id="rules_foreign_cc_dependencies-additional_shell_toolchain_mappings"></a>additional_shell_toolchain_mappings | Mappings of the shell toolchain functions to execution and target platform constraints. Similar to what is defined in @rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:toolchain_mappings.bzl in the TOOLCHAIN_MAPPINGS list. Please refer to the example in @rules_foreign_cc//toolchain_examples. | <code>[]</code> |
| <a id="rules_foreign_cc_dependencies-additional_shell_toolchain_mappings"></a>additional_shell_toolchain_mappings | Mappings of the shell toolchain functions to execution and target platform constraints. Similar to what is defined in @rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:toolchain_mappings.bzl in the TOOLCHAIN_MAPPINGS list. Please refer to the example in @rules_foreign_cc//toolchain_examples. | <code>[]</code> |
| <a id="rules_foreign_cc_dependencies-additional_shell_toolchain_package"></a>additional_shell_toolchain_package | A package under which additional toolchains, referencing the generated data for the passed additional_shell_toolchain_mappings, will be defined. This value is needed since register_toolchains() is called for these toolchains. Please refer to the example in @rules_foreign_cc//toolchain_examples. | <code>None</code> |
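For context, a minimal WORKSPACE initialization using the defaults described in this table might look like the following sketch (the load path matches the one used by the Stardoc module in this commit):

```python
load("@rules_foreign_cc//:workspace_definitions.bzl", "rules_foreign_cc_dependencies")

# Accepting the defaults registers both preinstalled and built-from-source
# toolchains (ninja defaults to 1.10.2 per the table above).
rules_foreign_cc_dependencies()
```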


@ -1,6 +1,19 @@
"""A module exporting symbols for Stardoc generation."""
load("@rules_foreign_cc//:workspace_definitions.bzl", _rules_foreign_cc_dependencies = "rules_foreign_cc_dependencies")
load(
"@rules_foreign_cc//foreign_cc:defs.bzl",
_boost_build = "boost_build",
_cmake = "cmake",
_configure_make = "configure_make",
_make = "make",
_ninja = "ninja",
)
load(
"@rules_foreign_cc//foreign_cc:providers.bzl",
_ForeignCcArtifact = "ForeignCcArtifact",
_ForeignCcDeps = "ForeignCcDeps",
)
load("@rules_foreign_cc//foreign_cc/built_tools:cmake_build.bzl", _cmake_tool = "cmake_tool")
load("@rules_foreign_cc//foreign_cc/built_tools:make_build.bzl", _make_tool = "make_tool")
load("@rules_foreign_cc//foreign_cc/built_tools:ninja_build.bzl", _ninja_tool = "ninja_tool")
@ -9,19 +22,6 @@ load(
_ToolInfo = "ToolInfo",
_native_tool_toolchain = "native_tool_toolchain",
)
load("@rules_foreign_cc//tools/build_defs:boost_build.bzl", _boost_build = "boost_build")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", _cmake = "cmake")
load("@rules_foreign_cc//tools/build_defs:configure.bzl", _configure_make = "configure_make")
load(
"@rules_foreign_cc//tools/build_defs:framework.bzl",
_ConfigureParameters = "ConfigureParameters",
_ForeignCcArtifact = "ForeignCcArtifact",
_ForeignCcDeps = "ForeignCcDeps",
_InputFiles = "InputFiles",
_WrappedOutputs = "WrappedOutputs",
)
load("@rules_foreign_cc//tools/build_defs:make.bzl", _make = "make")
load("@rules_foreign_cc//tools/build_defs:ninja.bzl", _ninja = "ninja")
# Rules Foreign CC symbols
boost_build = _boost_build
@ -35,20 +35,11 @@ ninja = _ninja
ninja_tool = _ninja_tool
rules_foreign_cc_dependencies = _rules_foreign_cc_dependencies
# buildifier: disable=name-conventions
ConfigureParameters = _ConfigureParameters
# buildifier: disable=name-conventions
ForeignCcArtifact = _ForeignCcArtifact
# buildifier: disable=name-conventions
ForeignCcDeps = _ForeignCcDeps
# buildifier: disable=name-conventions
InputFiles = _InputFiles
# buildifier: disable=name-conventions
WrappedOutputs = _WrappedOutputs
# buildifier: disable=name-conventions
ToolInfo = _ToolInfo


@ -1,6 +1,6 @@
load("@rules_android//android:rules.bzl", "android_binary", "android_library")
load("@rules_cc//cc:defs.bzl", "cc_library")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
cmake(
name = "libhello",


@ -1,5 +1,5 @@
load("@rules_cc//cc:defs.bzl", "cc_binary", "cc_library")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
filegroup(
name = "srcs",


@ -2,7 +2,7 @@
# for test only
load("@rules_cc//cc:defs.bzl", "cc_binary")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
filegroup(
name = "srcs",


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
cmake(
name = "lib_a",


@ -1,7 +1,7 @@
# example code is taken from https://github.com/Akagi201/learning-cmake/tree/master/hello-world-lib
# for test only
load("@bazel_tools//tools/build_rules:test_rules.bzl", "file_test")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
filegroup(
name = "srcs",


@ -2,7 +2,7 @@
# for test only
load("@rules_cc//cc:defs.bzl", "cc_test")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
filegroup(
name = "srcs",


@ -2,7 +2,7 @@
# for test only
load("@rules_cc//cc:defs.bzl", "cc_binary")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
filegroup(
name = "srcs",


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
cmake(
name = "hello_world",


@ -1,6 +1,6 @@
load("@bazel_skylib//rules:build_test.bzl", "build_test")
load("@rules_cc//cc:defs.bzl", "cc_test")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
cmake(
name = "liba",


@ -1,5 +1,5 @@
load("@rules_cc//cc:defs.bzl", "cc_test")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
# Example of the cmake_external target built with Bazel-built dependency
cmake(


@ -1,5 +1,5 @@
load("@rules_cc//cc:defs.bzl", "cc_test")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
cmake(
name = "cmake_with_data",


@ -1,5 +1,5 @@
load("@rules_cc//cc:defs.bzl", "cc_test")
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
filegroup(
name = "sources",


@ -1,5 +1,5 @@
load("@rules_cc//cc:defs.bzl", "cc_test")
load("@rules_foreign_cc//tools/build_defs:configure.bzl", "configure_make")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "configure_make")
configure_make(
name = "gperftools_build",


@ -1,5 +1,5 @@
load("@rules_cc//cc:defs.bzl", "cc_library", "cc_test")
load("@rules_foreign_cc//tools/build_defs:configure.bzl", "configure_make")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "configure_make")
cc_library(
name = "built_with_bazel",


@ -1,5 +1,5 @@
load("@rules_cc//cc:defs.bzl", "cc_test")
load("@rules_foreign_cc//tools/build_defs:make.bzl", "make")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "make")
make(
name = "make_lib",


@ -1,5 +1,5 @@
load("@rules_cc//cc:defs.bzl", "cc_test")
load("@rules_foreign_cc//tools/build_defs:ninja.bzl", "ninja")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "ninja")
ninja(
name = "ninja_lib",


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:configure.bzl", "configure_make")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "configure_make")
package(default_visibility = ["//visibility:public"])


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
package(default_visibility = ["//visibility:public"])


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
filegroup(
name = "all_srcs",


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:ninja.bzl", "ninja")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "ninja")
filegroup(
name = "srcs",


@ -1,6 +1,6 @@
"""libiconv is only expected to be used on MacOS systems"""
load("@rules_foreign_cc//tools/build_defs:configure.bzl", "configure_make")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "configure_make")
filegroup(
name = "all",


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
filegroup(
name = "all_srcs",


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
package(default_visibility = ["//visibility:public"])


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
filegroup(
name = "all_srcs",


@ -2,7 +2,7 @@
https://github.com/bazelbuild/rules_foreign_cc/issues/337
"""
load("@rules_foreign_cc//tools/build_defs:configure.bzl", "configure_make")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "configure_make")
# Read https://wiki.openssl.org/index.php/Compilation_and_Installation


@ -1,6 +1,6 @@
"""pcre is only expected to be used on Linux systems"""
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
package(default_visibility = ["//visibility:public"])


@ -1,4 +1,4 @@
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
package(default_visibility = ["//visibility:public"])


@ -4,7 +4,7 @@ load(":test_platform_name_rule.bzl", "test_platform_name")
# Targets of type "toolchain" are created inside for the values of ADD_TOOLCHAIN_MAPPINGS,
# with the correct constraints taken from mappings.
# The "toolchain_data" is referencing generated @commands_overloads.
# The "toolchain_data" is referencing generated @rules_foreign_cc_commands_overloads.
register_mappings(
mappings = ADD_TOOLCHAIN_MAPPINGS,
toolchain_type_ = "@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",


@ -2,10 +2,10 @@
(to modify the fragments of generated shell script)
- define your own shell toolchain file(s) by copying @rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains/impl:linux_commands.bzl,
- define your own shell toolchain file(s) by copying @rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains/impl:linux_commands.bzl,
and modifying the methods.
- create a mapping: a list of ToolchainMapping with the mappings between created file(s) and execution or/and target platform constraints.
- in the BUILD file of some package, call "register_mappings" macro from "@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:defs.bzl", passing the mappings and toolchain_type_ = "@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands"
- in the BUILD file of some package, call "register_mappings" macro from "@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:defs.bzl", passing the mappings and toolchain_type_ = "@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands"
- in the WORKSPACE file of your main repository, when you initialize rules_foreign_cc, pass the mappings and the package, in which BUILD file you called "register_mappings" macro
Please look how it is done in this example.


@ -1,5 +1,6 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:toolchain_mappings.bzl", "ToolchainMapping")
# buildifier: disable=bzl-visibility
load("@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:toolchain_mappings.bzl", "ToolchainMapping")
ADD_TOOLCHAIN_MAPPINGS = [
ToolchainMapping(


@ -1,5 +1,6 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:function_and_call.bzl", "FunctionAndCall")
# buildifier: disable=bzl-visibility
load("@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:function_and_call.bzl", "FunctionAndCall")
_REPLACE_VALUE = "BAZEL_GEN_ROOT"


@ -1,5 +1,6 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//tools/build_defs:shell_script_helper.bzl", "os_name")
# buildifier: disable=bzl-visibility
load("@rules_foreign_cc//foreign_cc/private:shell_script_helper.bzl", "os_name")
def _test_platform_name(ctx):
os_name_ = os_name(ctx)
@ -15,5 +16,5 @@ test_platform_name = rule(
attrs = {
"expected": attr.string(),
},
toolchains = ["@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands"],
toolchains = ["@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands"],
)


@ -1,5 +1,8 @@
""" This module is deprecated and has been moved to `//toolchains/built_tools/...` """
load("//foreign_cc/built_tools:cmake_build.bzl", _cmake_tool = "cmake_tool")
load(":deprecation.bzl", "print_deprecation")
print_deprecation()
cmake_tool = _cmake_tool


@ -0,0 +1,10 @@
"""A helper module to inform users this package is deprecated"""
def print_deprecation():
# buildifier: disable=print
print(
"`@rules_foreign_cc//for_workspace/...` is deprecated, please " +
"find the relevant symbols in `@rules_foreign_cc//foreign_cc/built_tools/...`. " +
"Note that the core rules can now be loaded from " +
"`@rules_foreign_cc//foreign_cc:defs.bzl`",
)


@ -1,5 +1,8 @@
""" This module is deprecated and has been moved to `//toolchains/built_tools/...` """
load("//foreign_cc/built_tools:make_build.bzl", _make_tool = "make_tool")
load(":deprecation.bzl", "print_deprecation")
print_deprecation()
make_tool = _make_tool


@ -1,5 +1,8 @@
""" This module is deprecated and has been moved to `//toolchains/built_tools/...` """
load("//foreign_cc/built_tools:ninja_build.bzl", _ninja_tool = "ninja_tool")
load(":deprecation.bzl", "print_deprecation")
print_deprecation()
ninja_tool = _ninja_tool


@ -3,6 +3,9 @@
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
load("@bazel_tools//tools/build_defs/repo:utils.bzl", "maybe")
# buildifier: disable=print
print("@rules_foreign_cc//for_workspace:repositories.bzl is deprecated and will be removed in the near future")
def repositories():
"""Declare repositories used by `rules_foreign_cc`"""


@ -6,5 +6,8 @@ bzl_library(
visibility = ["//:__subpackages__"],
deps = [
"//foreign_cc/built_tools:bzl_srcs",
"//foreign_cc/private:bzl_srcs",
"@bazel_skylib//lib:collections",
"@bazel_skylib//lib:versions",
],
)


@ -0,0 +1,55 @@
""" Rule for building Boost from sources. """
load("//foreign_cc/private:detect_root.bzl", "detect_root")
load(
"//foreign_cc/private:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
def _boost_build(ctx):
attrs = create_attrs(
ctx.attr,
configure_name = "BuildBoost",
create_configure_script = _create_configure_script,
make_commands = ["./b2 install {} --prefix=.".format(" ".join(ctx.attr.user_options))],
)
return cc_external_rule_impl(ctx, attrs)
def _create_configure_script(configureParameters):
ctx = configureParameters.ctx
root = detect_root(ctx.attr.lib_source)
return "\n".join([
"cd $INSTALLDIR",
"##copy_dir_contents_to_dir## $$EXT_BUILD_ROOT$$/{}/. .".format(root),
"./bootstrap.sh {}".format(" ".join(ctx.attr.bootstrap_options)),
])
def _attrs():
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
attrs.update({
"bootstrap_options": attr.string_list(
doc = "any additional flags to pass to bootstrap.sh",
mandatory = False,
),
"user_options": attr.string_list(
doc = "any additional flags to pass to b2",
mandatory = False,
),
})
return attrs
boost_build = rule(
doc = "Rule for building Boost. Invokes bootstrap.sh and then b2 install.",
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _boost_build,
toolchains = [
"@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)


@ -1,7 +1,7 @@
""" Rule for building CMake from sources. """
load("@rules_foreign_cc//tools/build_defs:detect_root.bzl", "detect_root")
load("@rules_foreign_cc//tools/build_defs:shell_script_helper.bzl", "convert_shell_script")
load("//foreign_cc/private:detect_root.bzl", "detect_root")
load("//foreign_cc/private:shell_script_helper.bzl", "convert_shell_script")
def _cmake_tool(ctx):
root = detect_root(ctx.attr.cmake_srcs)
@ -42,7 +42,7 @@ cmake_tool = rule(
output_to_genfiles = True,
implementation = _cmake_tool,
toolchains = [
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
str(Label("//foreign_cc/private/shell_toolchain/toolchains:shell_commands")),
"@bazel_tools//tools/cpp:toolchain_type",
],
)


@ -1,12 +1,9 @@
""" Rule for building GNU Make from sources. """
load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain")
load(
"@rules_foreign_cc//tools/build_defs:run_shell_file_utils.bzl",
"fictive_file_in_genroot",
)
load("@rules_foreign_cc//tools/build_defs:shell_script_helper.bzl", "convert_shell_script")
load("//tools/build_defs:detect_root.bzl", "detect_root")
load("//foreign_cc/private:detect_root.bzl", "detect_root")
load("//foreign_cc/private:run_shell_file_utils.bzl", "fictive_file_in_genroot")
load("//foreign_cc/private:shell_script_helper.bzl", "convert_shell_script")
def _make_tool(ctx):
root = detect_root(ctx.attr.make_srcs)
@ -56,7 +53,7 @@ make_tool = rule(
output_to_genfiles = True,
implementation = _make_tool,
toolchains = [
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
"@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)


@ -1,7 +1,7 @@
""" Rule for building Ninja from sources. """
load("@rules_foreign_cc//tools/build_defs:shell_script_helper.bzl", "convert_shell_script")
load("//tools/build_defs:detect_root.bzl", "detect_root")
load("//foreign_cc/private:detect_root.bzl", "detect_root")
load("//foreign_cc/private:shell_script_helper.bzl", "convert_shell_script")
def _ninja_tool(ctx):
root = detect_root(ctx.attr.ninja_srcs)
@ -36,7 +36,7 @@ ninja_tool = rule(
output_to_genfiles = True,
implementation = _ninja_tool,
toolchains = [
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
"@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)

foreign_cc/cmake.bzl (new file, 165 lines added)

@ -0,0 +1,165 @@
""" Defines the rule for building external library with CMake
"""
load(
"//foreign_cc/private:cc_toolchain_util.bzl",
"get_flags_info",
"get_tools_info",
"is_debug_mode",
)
load("//foreign_cc/private:cmake_script.bzl", "create_cmake_script")
load(
"//foreign_cc/private:detect_root.bzl",
"detect_root",
)
load(
"//foreign_cc/private:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
load(
"//toolchains/native_tools:tool_access.bzl",
"get_cmake_data",
"get_make_data",
"get_ninja_data",
)
def _cmake_impl(ctx):
cmake_data = get_cmake_data(ctx)
make_data = get_make_data(ctx)
tools_deps = ctx.attr.tools_deps + cmake_data.deps + make_data.deps
ninja_data = get_ninja_data(ctx)
make_commands = ctx.attr.make_commands
if _uses_ninja(ctx.attr.make_commands):
tools_deps += ninja_data.deps
make_commands = [command.replace("ninja", ninja_data.path) for command in make_commands]
attrs = create_attrs(
ctx.attr,
configure_name = "CMake",
create_configure_script = _create_configure_script,
postfix_script = "##copy_dir_contents_to_dir## $$BUILD_TMPDIR$$/$$INSTALL_PREFIX$$ $$INSTALLDIR$$\n" + ctx.attr.postfix_script,
tools_deps = tools_deps,
cmake_path = cmake_data.path,
ninja_path = ninja_data.path,
make_path = make_data.path,
make_commands = make_commands,
)
return cc_external_rule_impl(ctx, attrs)
def _uses_ninja(make_commands):
for command in make_commands:
(before, separator, after) = command.partition(" ")
if before == "ninja":
return True
return False
def _create_configure_script(configureParameters):
ctx = configureParameters.ctx
inputs = configureParameters.inputs
root = detect_root(ctx.attr.lib_source)
if len(ctx.attr.working_directory) > 0:
root = root + "/" + ctx.attr.working_directory
tools = get_tools_info(ctx)
# CMake will replace <TARGET> with the actual output file
flags = get_flags_info(ctx, "<TARGET>")
no_toolchain_file = ctx.attr.cache_entries.get("CMAKE_TOOLCHAIN_FILE") or not ctx.attr.generate_crosstool_file
define_install_prefix = "export INSTALL_PREFIX=\"" + _get_install_prefix(ctx) + "\"\n"
configure_script = create_cmake_script(
workspace_name = ctx.workspace_name,
cmake_path = configureParameters.attrs.cmake_path,
tools = tools,
flags = flags,
install_prefix = "$$INSTALL_PREFIX$$",
root = root,
no_toolchain_file = no_toolchain_file,
user_cache = dict(ctx.attr.cache_entries),
user_env = dict(ctx.attr.env_vars),
options = ctx.attr.cmake_options,
include_dirs = inputs.include_dirs,
is_debug_mode = is_debug_mode(ctx),
)
return define_install_prefix + configure_script
def _get_install_prefix(ctx):
if ctx.attr.install_prefix:
return ctx.attr.install_prefix
if ctx.attr.lib_name:
return ctx.attr.lib_name
return ctx.attr.name
def _attrs():
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
attrs.update({
"cache_entries": attr.string_dict(
doc = (
"CMake cache entries to initialize (they will be passed with -Dkey=value) " +
"Values, defined by the toolchain, will be joined with the values, passed here. " +
"(Toolchain values come first)"
),
mandatory = False,
default = {},
),
"cmake_options": attr.string_list(
doc = "Other CMake options",
mandatory = False,
default = [],
),
"env_vars": attr.string_dict(
doc = (
"CMake environment variable values to join with toolchain-defined. " +
"For example, additional CXXFLAGS."
),
mandatory = False,
default = {},
),
"generate_crosstool_file": attr.bool(
doc = (
"When True, CMake crosstool file will be generated from the toolchain values, " +
"provided cache-entries and env_vars (some values will still be passed as -Dkey=value " +
"and environment variables). " +
"If CMAKE_TOOLCHAIN_FILE cache entry is passed, specified crosstool file will be used " +
"When using this option to cross-compile, it is required to specify CMAKE_SYSTEM_NAME in the " +
"cache_entries"
),
mandatory = False,
default = True,
),
"install_prefix": attr.string(
doc = "Relative install prefix to be passed to CMake in -DCMAKE_INSTALL_PREFIX",
mandatory = False,
),
"working_directory": attr.string(
doc = (
"Working directory, with the main CMakeLists.txt " +
"(otherwise, the top directory of the lib_source label files is used.)"
),
mandatory = False,
default = "",
),
})
return attrs
cmake = rule(
doc = "Rule for building external library with CMake.",
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _cmake_impl,
toolchains = [
"@rules_foreign_cc//toolchains:cmake_toolchain",
"@rules_foreign_cc//toolchains:ninja_toolchain",
"@rules_foreign_cc//toolchains:make_toolchain",
"@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)
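Putting the attributes above together, a consuming BUILD file might use the rule roughly as follows (a sketch; the `@zlib//:all` label and the chosen cache entries are illustrative, and the remaining attributes come from the shared CC_EXTERNAL_RULE_ATTRIBUTES):

```python
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")

cmake(
    name = "zlib",
    lib_source = "@zlib//:all",  # hypothetical filegroup with the CMake project sources
    cache_entries = {
        # Passed to CMake as -DBUILD_SHARED_LIBS=OFF (joined with toolchain values).
        "BUILD_SHARED_LIBS": "OFF",
    },
    env_vars = {"CXXFLAGS": "-O2"},  # joined with the toolchain-defined environment
    generate_crosstool_file = True,  # default: emit a CMake crosstool file
)
```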

foreign_cc/configure.bzl (new file, 184 lines added)

@ -0,0 +1,184 @@
# buildifier: disable=module-docstring
load(
"//foreign_cc/private:cc_toolchain_util.bzl",
"get_flags_info",
"get_tools_info",
"is_debug_mode",
)
load("//foreign_cc/private:configure_script.bzl", "create_configure_script")
load("//foreign_cc/private:detect_root.bzl", "detect_root")
load(
"//foreign_cc/private:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
load("//foreign_cc/private:shell_script_helper.bzl", "os_name")
load("//toolchains/native_tools:tool_access.bzl", "get_make_data")
def _configure_make(ctx):
make_data = get_make_data(ctx)
tools_deps = ctx.attr.tools_deps + make_data.deps
copy_results = "##copy_dir_contents_to_dir## $$BUILD_TMPDIR$$/$$INSTALL_PREFIX$$ $$INSTALLDIR$$\n"
attrs = create_attrs(
ctx.attr,
configure_name = "Configure",
create_configure_script = _create_configure_script,
postfix_script = copy_results + "\n" + ctx.attr.postfix_script,
tools_deps = tools_deps,
make_path = make_data.path,
)
return cc_external_rule_impl(ctx, attrs)
def _create_configure_script(configureParameters):
ctx = configureParameters.ctx
inputs = configureParameters.inputs
root = detect_root(ctx.attr.lib_source)
install_prefix = _get_install_prefix(ctx)
tools = get_tools_info(ctx)
flags = get_flags_info(ctx)
define_install_prefix = "export INSTALL_PREFIX=\"" + _get_install_prefix(ctx) + "\"\n"
configure = create_configure_script(
workspace_name = ctx.workspace_name,
# as default, pass execution OS as target OS
target_os = os_name(ctx),
tools = tools,
flags = flags,
root = root,
user_options = ctx.attr.configure_options,
user_vars = dict(ctx.attr.configure_env_vars),
is_debug = is_debug_mode(ctx),
configure_command = ctx.attr.configure_command,
deps = ctx.attr.deps,
inputs = inputs,
configure_in_place = ctx.attr.configure_in_place,
autoconf = ctx.attr.autoconf,
autoconf_options = ctx.attr.autoconf_options,
autoconf_env_vars = ctx.attr.autoconf_env_vars,
autoreconf = ctx.attr.autoreconf,
autoreconf_options = ctx.attr.autoreconf_options,
autoreconf_env_vars = ctx.attr.autoreconf_env_vars,
autogen = ctx.attr.autogen,
autogen_command = ctx.attr.autogen_command,
autogen_options = ctx.attr.autogen_options,
autogen_env_vars = ctx.attr.autogen_env_vars,
)
return "\n".join([define_install_prefix, configure])
def _get_install_prefix(ctx):
if ctx.attr.install_prefix:
return ctx.attr.install_prefix
if ctx.attr.lib_name:
return ctx.attr.lib_name
return ctx.attr.name
def _attrs():
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
attrs.update({
"autoconf": attr.bool(
mandatory = False,
default = False,
doc = (
"Set to True if 'autoconf' should be invoked before 'configure', " +
"currently requires 'configure_in_place' to be True."
),
),
"autoconf_env_vars": attr.string_dict(
doc = "Environment variables to be set for 'autoconf' invocation.",
),
"autoconf_options": attr.string_list(
doc = "Any options to be put in the 'autoconf' command line.",
),
"autogen": attr.bool(
doc = (
"Set to True if 'autogen.sh' should be invoked before 'configure', " +
"currently requires 'configure_in_place' to be True."
),
mandatory = False,
default = False,
),
"autogen_command": attr.string(
doc = (
"The name of the autogen script file, default: autogen.sh. " +
"Many projects use autogen.sh however the Autotools FAQ recommends bootstrap " +
"so we provide this option to support that."
),
default = "autogen.sh",
),
"autogen_env_vars": attr.string_dict(
doc = "Environment variables to be set for 'autogen' invocation.",
),
"autogen_options": attr.string_list(
doc = "Any options to be put in the 'autogen.sh' command line.",
),
"autoreconf": attr.bool(
doc = (
"Set to True if 'autoreconf' should be invoked before 'configure', " +
"currently requires 'configure_in_place' to be True."
),
mandatory = False,
default = False,
),
"autoreconf_env_vars": attr.string_dict(
doc = "Environment variables to be set for 'autoreconf' invocation.",
),
"autoreconf_options": attr.string_list(
doc = "Any options to be put in the 'autoreconf' command line.",
),
"configure_command": attr.string(
doc = (
"The name of the configuration script file, default: configure. " +
"The file must be in the root of the source directory."
),
default = "configure",
),
"configure_env_vars": attr.string_dict(
doc = "Environment variables to be set for the 'configure' invocation.",
),
"configure_in_place": attr.bool(
doc = (
"Set to True if 'configure' should be invoked in place, i.e. from its enclosing " +
"directory."
),
mandatory = False,
default = False,
),
"configure_options": attr.string_list(
doc = "Any options to be put on the 'configure' command line.",
),
"install_prefix": attr.string(
doc = (
"Install prefix, i.e. relative path to where to install the result of the build. " +
"Passed to the 'configure' script with --prefix flag."
),
mandatory = False,
),
})
return attrs
configure_make = rule(
doc = (
"Rule for building external libraries with configure-make pattern. " +
"A 'configure' script is invoked with --prefix=install (by default), " +
"and other parameters for compilation and linking, taken from Bazel C/C++ " +
"toolchain and passed dependencies. " +
"After configuration, GNU Make is called."
),
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _configure_make,
toolchains = [
"@rules_foreign_cc//toolchains:make_toolchain",
"@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)
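For reference, a minimal, hypothetical usage of `configure_make` from a `BUILD.bazel` file; the target name and source label are illustrative placeholders, not part of this change:

```python
load("@rules_foreign_cc//foreign_cc:defs.bzl", "configure_make")

# "libfoo" and the source label are placeholders for a real
# autotools-based project.
configure_make(
    name = "libfoo",
    lib_source = "@libfoo//:all_srcs",
    configure_options = ["--disable-shared"],
)
```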

foreign_cc/defs.bzl Normal file

@ -0,0 +1,13 @@
"""Public entry point to all Foreign CC rules and supported APIs."""
load(":boost_build.bzl", _boost_build = "boost_build")
load(":cmake.bzl", _cmake = "cmake")
load(":configure.bzl", _configure_make = "configure_make")
load(":make.bzl", _make = "make")
load(":ninja.bzl", _ninja = "ninja")
boost_build = _boost_build
cmake = _cmake
configure_make = _configure_make
make = _make
ninja = _ninja
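With these re-exports, consumers can load every rule from a single public entry point instead of the deprecated per-rule paths under `tools/build_defs` (which keep working via the legacy aliases added in this change):

```python
# Before (deprecated location):
# load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake")

# After (public entry point):
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake", "configure_make", "make", "ninja")
```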

foreign_cc/make.bzl Normal file

@ -0,0 +1,130 @@
# buildifier: disable=module-docstring
load(
"//foreign_cc/private:cc_toolchain_util.bzl",
"get_flags_info",
"get_tools_info",
)
load("//foreign_cc/private:configure_script.bzl", "create_make_script")
load(
"//foreign_cc/private:detect_root.bzl",
"detect_root",
)
load(
"//foreign_cc/private:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
load("//toolchains/native_tools:tool_access.bzl", "get_make_data")
def _make(ctx):
make_data = get_make_data(ctx)
tools_deps = ctx.attr.tools_deps + make_data.deps
attrs = create_attrs(
ctx.attr,
configure_name = "GNUMake",
create_configure_script = _create_make_script,
tools_deps = tools_deps,
make_path = make_data.path,
make_commands = [],
)
return cc_external_rule_impl(ctx, attrs)
def _create_make_script(configureParameters):
ctx = configureParameters.ctx
inputs = configureParameters.inputs
root = detect_root(ctx.attr.lib_source)
install_prefix = _get_install_prefix(ctx)
tools = get_tools_info(ctx)
flags = get_flags_info(ctx)
make_commands = ctx.attr.make_commands or [
"{make} {keep_going} -C $$EXT_BUILD_ROOT$$/{root}".format(
make = configureParameters.attrs.make_path,
keep_going = "-k" if ctx.attr.keep_going else "",
root = root,
),
"{make} -C $$EXT_BUILD_ROOT$$/{root} install PREFIX={prefix}".format(
make = configureParameters.attrs.make_path,
root = root,
prefix = install_prefix,
),
]
return create_make_script(
workspace_name = ctx.workspace_name,
tools = tools,
flags = flags,
root = root,
user_vars = dict(ctx.attr.make_env_vars),
deps = ctx.attr.deps,
inputs = inputs,
make_commands = make_commands,
prefix = install_prefix,
)
def _get_install_prefix(ctx):
if ctx.attr.prefix:
return ctx.attr.prefix
return "$$INSTALLDIR$$"
def _attrs():
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
attrs.update({
"keep_going": attr.bool(
doc = (
"Keep going when some targets cannot be made; the -k flag is passed to make " +
"(applies only if make_commands attribute is not set). " +
"Please have a look at _create_make_script for default make_commands."
),
mandatory = False,
default = True,
),
"make_commands": attr.string_list(
doc = (
"Overrides the default value of make_commands with an empty list so that " +
"a better default value can be provided programmatically."
),
mandatory = False,
default = [],
),
"make_env_vars": attr.string_dict(
doc = "Environment variables to be set for the 'make' invocation.",
),
"prefix": attr.string(
doc = (
"Install prefix, an absolute path. " +
"Passed to the GNU make via \"make install PREFIX=<value>\". " +
"By default, the install directory created under sandboxed execution root is used. " +
"Build results are copied to Bazel's output directory, so the prefix only matters " +
"if it is recorded into any text files by the Makefile script. " +
"In that case, note that rules_foreign_cc overrides the paths under the " +
"execution root with the \"BAZEL_GEN_ROOT\" value."
),
mandatory = False,
),
})
return attrs
make = rule(
doc = (
"Rule for building external libraries with GNU Make. " +
"GNU Make commands (make and make install by default) are invoked with the install prefix " +
"(by default), and other environment variables for compilation and linking, taken from the Bazel C/C++ " +
"toolchain and passed dependencies."
),
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _make,
toolchains = [
"@rules_foreign_cc//toolchains:make_toolchain",
"@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)
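A minimal, hypothetical `make` target (the names are placeholders); `make_commands` is left unset so the default make/make install pair from `_create_make_script` is used:

```python
load("@rules_foreign_cc//foreign_cc:defs.bzl", "make")

# "libbar" and the source label are placeholders.
make(
    name = "libbar",
    lib_source = "@libbar//:all_srcs",
    make_env_vars = {"CFLAGS": "-O2"},
)
```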

foreign_cc/ninja.bzl Normal file

@ -0,0 +1,125 @@
"""A module defining the `ninja` rule. A rule for building projects using the Ninja build tool"""
load(
"//foreign_cc/private:detect_root.bzl",
"detect_root",
)
load(
"//foreign_cc/private:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
load("//toolchains/native_tools:tool_access.bzl", "get_ninja_data")
def _ninja_impl(ctx):
"""The implementation of the `ninja` rule
Args:
ctx (ctx): The rule's context object
Returns:
list: A list of providers. See `cc_external_rule_impl`
"""
ninja_data = get_ninja_data(ctx)
tools_deps = ctx.attr.tools_deps + ninja_data.deps
attrs = create_attrs(
ctx.attr,
configure_name = "Ninja",
create_configure_script = _create_ninja_script,
tools_deps = tools_deps,
ninja_path = ninja_data.path,
make_commands = [],
)
return cc_external_rule_impl(ctx, attrs)
def _create_ninja_script(configureParameters):
"""Creates the bash commands for invoking commands to build ninja projects
Args:
configureParameters (struct): See `ConfigureParameters`
Returns:
str: A string representing a section of a bash script
"""
ctx = configureParameters.ctx
script = []
root = detect_root(ctx.attr.lib_source)
script.append("##symlink_contents_to_dir## $$EXT_BUILD_ROOT$$/{} $$BUILD_TMPDIR$$".format(root))
data = ctx.attr.data or list()
# Generate a list of arguments for ninja
args = " ".join([
ctx.expand_location(arg, data)
for arg in ctx.attr.args
])
# Set the directory location for the build commands
directory = "$$EXT_BUILD_ROOT$$/{}".format(root)
if ctx.attr.directory:
directory = ctx.expand_location(ctx.attr.directory, data)
# Generate commands for all the targets, ensuring there's
# always at least 1 call to the default target.
for target in ctx.attr.targets or [""]:
# Note that even though directory is always passed, the
# following arguments can take precedence.
script.append("{ninja} -C {dir} {args} {target}".format(
ninja = configureParameters.attrs.ninja_path,
dir = directory,
args = args,
target = target,
))
return "\n".join(script)
def _attrs():
"""Modifies the common set of attributes used by rules_foreign_cc and sets Ninja specific attrs
Returns:
dict: Attributes of the `ninja` rule
"""
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
# Drop old vars
attrs.pop("make_commands")
attrs.update({
"args": attr.string_list(
doc = "A list of arguments to pass to the call to `ninja`",
),
"directory": attr.string(
doc = (
"A directory to pass as the `-C` argument. The rule will always use the root " +
"directory of the `lib_source` attribute if this attribute is not set"
),
),
"targets": attr.string_list(
doc = (
"A list of ninja targets to build. To call the default target, simply pass `\"\"` as " +
"one of the items to this attribute."
),
),
})
return attrs
ninja = rule(
doc = (
"Rule for building external libraries with [Ninja](https://ninja-build.org/)."
),
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _ninja_impl,
toolchains = [
"@rules_foreign_cc//toolchains:ninja_toolchain",
"@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)
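A minimal, hypothetical `ninja` target (the names are placeholders); passing `""` in `targets` requests the default Ninja target, per the attribute docs above:

```python
load("@rules_foreign_cc//foreign_cc:defs.bzl", "ninja")

# "libbaz" and the source label are placeholders.
ninja(
    name = "libbaz",
    lib_source = "@libbaz//:all_srcs",
    targets = ["", "install"],
)
```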


@ -0,0 +1,11 @@
load("@bazel_skylib//:bzl_library.bzl", "bzl_library")
bzl_library(
name = "bzl_srcs",
srcs = glob(["**/*.bzl"]),
visibility = ["//:__subpackages__"],
deps = [
"//foreign_cc/private/shell_toolchain/polymorphism:bzl_srcs",
"//foreign_cc/private/shell_toolchain/toolchains:bzl_srcs",
],
)


@ -0,0 +1,419 @@
""" Defines create_linking_info, which wraps passed libraries into CcLinkingInfo
"""
load("@bazel_skylib//lib:collections.bzl", "collections")
load(
"@bazel_tools//tools/build_defs/cc:action_names.bzl",
"ASSEMBLE_ACTION_NAME",
"CPP_COMPILE_ACTION_NAME",
"CPP_LINK_DYNAMIC_LIBRARY_ACTION_NAME",
"CPP_LINK_EXECUTABLE_ACTION_NAME",
"CPP_LINK_STATIC_LIBRARY_ACTION_NAME",
"C_COMPILE_ACTION_NAME",
)
load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain")
LibrariesToLinkInfo = provider(
doc = "Libraries to be wrapped into CcLinkingInfo",
fields = dict(
static_libraries = "Static library files, optional",
shared_libraries = "Shared library files, optional",
interface_libraries = "Interface library files, optional",
),
)
CxxToolsInfo = provider(
doc = "Paths to the C/C++ tools, taken from the toolchain",
fields = dict(
cc = "C compiler",
cxx = "C++ compiler",
cxx_linker_static = "C++ linker to link static library",
cxx_linker_executable = "C++ linker to link executable",
),
)
CxxFlagsInfo = provider(
doc = "Flags for the C/C++ tools, taken from the toolchain",
fields = dict(
cc = "C compiler flags",
cxx = "C++ compiler flags",
cxx_linker_shared = "C++ linker flags when linking shared library",
cxx_linker_static = "C++ linker flags when linking static library",
cxx_linker_executable = "C++ linker flags when linking executable",
assemble = "Assemble flags",
),
)
def _to_list(element):
if element == None:
return []
else:
return [element]
def _to_depset(element):
if element == None:
return depset()
return depset(element)
def _configure_features(ctx, cc_toolchain):
return cc_common.configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain,
requested_features = ctx.features,
unsupported_features = ctx.disabled_features,
)
def _create_libraries_to_link(ctx, files):
libs = []
static_map = _files_map(_filter(files.static_libraries or [], _is_position_independent, True))
pic_static_map = _files_map(_filter(files.static_libraries or [], _is_position_independent, False))
shared_map = _files_map(files.shared_libraries or [])
interface_map = _files_map(files.interface_libraries or [])
names = collections.uniq(static_map.keys() + pic_static_map.keys() + shared_map.keys() + interface_map.keys())
cc_toolchain = find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain,
)
for name_ in names:
libs.append(cc_common.create_library_to_link(
actions = ctx.actions,
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain,
static_library = static_map.get(name_),
pic_static_library = pic_static_map.get(name_),
dynamic_library = shared_map.get(name_),
interface_library = interface_map.get(name_),
alwayslink = ctx.attr.alwayslink,
))
return depset(direct = libs)
def _is_position_independent(file):
return file.basename.endswith(".pic.a")
def _filter(list_, predicate, inverse):
result = []
for elem in list_:
check = predicate(elem)
if not inverse and check or inverse and not check:
result.append(elem)
return result
def _files_map(files_list):
by_names_map = {}
for file_ in files_list:
name_ = _file_name_no_ext(file_.basename)
value = by_names_map.get(name_)
if value:
fail("Can not have libraries with the same name in the same category")
by_names_map[name_] = file_
return by_names_map
def _defines_from_deps(ctx):
return depset(transitive = [dep[CcInfo].compilation_context.defines for dep in ctx.attr.deps])
def _build_cc_link_params(
ctx,
user_link_flags,
static_libraries,
dynamic_libraries,
runtime_artifacts):
static_shared = None
static_no_shared = None
if static_libraries != None and len(static_libraries) > 0:
static_shared = cc_common.create_cc_link_params(
ctx = ctx,
user_link_flags = user_link_flags,
libraries_to_link = _to_depset(static_libraries),
)
static_no_shared = cc_common.create_cc_link_params(
ctx = ctx,
libraries_to_link = _to_depset(static_libraries),
)
else:
static_shared = cc_common.create_cc_link_params(
ctx = ctx,
user_link_flags = user_link_flags,
libraries_to_link = _to_depset(dynamic_libraries),
dynamic_libraries_for_runtime = _to_depset(runtime_artifacts),
)
static_no_shared = cc_common.create_cc_link_params(
ctx = ctx,
libraries_to_link = _to_depset(dynamic_libraries),
dynamic_libraries_for_runtime = _to_depset(runtime_artifacts),
)
no_static_shared = None
no_static_no_shared = None
if dynamic_libraries != None and len(dynamic_libraries) > 0:
no_static_shared = cc_common.create_cc_link_params(
ctx = ctx,
user_link_flags = user_link_flags,
libraries_to_link = _to_depset(dynamic_libraries),
dynamic_libraries_for_runtime = _to_depset(runtime_artifacts),
)
no_static_no_shared = cc_common.create_cc_link_params(
ctx = ctx,
libraries_to_link = _to_depset(dynamic_libraries),
dynamic_libraries_for_runtime = _to_depset(runtime_artifacts),
)
else:
no_static_shared = cc_common.create_cc_link_params(
ctx = ctx,
user_link_flags = user_link_flags,
libraries_to_link = _to_depset(static_libraries),
)
no_static_no_shared = cc_common.create_cc_link_params(
ctx = ctx,
libraries_to_link = _to_depset(static_libraries),
)
return {
"dynamic_mode_params_for_dynamic_library": no_static_shared,
"dynamic_mode_params_for_executable": no_static_no_shared,
"static_mode_params_for_dynamic_library": static_shared,
"static_mode_params_for_executable": static_no_shared,
}
def targets_windows(ctx, cc_toolchain):
"""Returns true if build is targeting Windows
Args:
ctx: rule context
cc_toolchain: optional - Cc toolchain
"""
toolchain = cc_toolchain if cc_toolchain else find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = toolchain,
)
return cc_common.is_enabled(
feature_configuration = feature_configuration,
feature_name = "targets_windows",
)
def create_linking_info(ctx, user_link_flags, files):
"""Creates CcLinkingInfo for the passed user link options and libraries.
Args:
ctx (ctx): rule context
user_link_flags (list of strings): link options, provided by the user
files (LibrariesToLink): provider with the library files
"""
return cc_common.create_linking_context(
linker_inputs = depset(direct = [
cc_common.create_linker_input(
owner = ctx.label,
libraries = _create_libraries_to_link(ctx, files),
user_link_flags = depset(direct = user_link_flags),
),
]),
)
# buildifier: disable=function-docstring
def get_env_vars(ctx):
cc_toolchain = find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain,
)
copts = ctx.attr.copts if hasattr(ctx.attr, "copts") else []
vars = dict()
for action_name in [C_COMPILE_ACTION_NAME, CPP_LINK_STATIC_LIBRARY_ACTION_NAME, CPP_LINK_EXECUTABLE_ACTION_NAME]:
vars.update(cc_common.get_environment_variables(
feature_configuration = feature_configuration,
action_name = action_name,
variables = cc_common.create_compile_variables(
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain,
user_compile_flags = copts,
),
))
return vars
def is_debug_mode(ctx):
# Compilation mode currently defaults to fastbuild. Use that if for some reason the variable is not set
# https://docs.bazel.build/versions/master/command-line-reference.html#flag--compilation_mode
return ctx.var.get("COMPILATION_MODE", "fastbuild") == "dbg"
def get_tools_info(ctx):
"""Takes information about tools paths from cc_toolchain, returns CxxToolsInfo
Args:
ctx: rule context
"""
cc_toolchain = find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain,
)
return CxxToolsInfo(
cc = cc_common.get_tool_for_action(
feature_configuration = feature_configuration,
action_name = C_COMPILE_ACTION_NAME,
),
cxx = cc_common.get_tool_for_action(
feature_configuration = feature_configuration,
action_name = CPP_COMPILE_ACTION_NAME,
),
cxx_linker_static = cc_common.get_tool_for_action(
feature_configuration = feature_configuration,
action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME,
),
cxx_linker_executable = cc_common.get_tool_for_action(
feature_configuration = feature_configuration,
action_name = CPP_LINK_EXECUTABLE_ACTION_NAME,
),
)
def get_flags_info(ctx, link_output_file = None):
"""Takes information about flags from cc_toolchain, returns CxxFlagsInfo
Args:
ctx: rule context
link_output_file: output file to be specified in the link command line flags
Returns:
CxxFlagsInfo: A provider containing Cxx flags
"""
cc_toolchain_ = find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain_,
)
copts = (ctx.fragments.cpp.copts + ctx.fragments.cpp.conlyopts) or []
cxxopts = (ctx.fragments.cpp.copts + ctx.fragments.cpp.cxxopts) or []
linkopts = ctx.fragments.cpp.linkopts or []
defines = _defines_from_deps(ctx)
flags = CxxFlagsInfo(
cc = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = C_COMPILE_ACTION_NAME,
variables = cc_common.create_compile_variables(
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain_,
preprocessor_defines = defines,
),
),
cxx = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = CPP_COMPILE_ACTION_NAME,
variables = cc_common.create_compile_variables(
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain_,
preprocessor_defines = defines,
add_legacy_cxx_options = True,
),
),
cxx_linker_shared = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = CPP_LINK_DYNAMIC_LIBRARY_ACTION_NAME,
variables = cc_common.create_link_variables(
cc_toolchain = cc_toolchain_,
feature_configuration = feature_configuration,
is_using_linker = True,
is_linking_dynamic_library = True,
),
),
cxx_linker_static = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME,
variables = cc_common.create_link_variables(
cc_toolchain = cc_toolchain_,
feature_configuration = feature_configuration,
is_using_linker = False,
is_linking_dynamic_library = False,
output_file = link_output_file,
),
),
cxx_linker_executable = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = CPP_LINK_EXECUTABLE_ACTION_NAME,
variables = cc_common.create_link_variables(
cc_toolchain = cc_toolchain_,
feature_configuration = feature_configuration,
is_using_linker = True,
is_linking_dynamic_library = False,
),
),
assemble = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = ASSEMBLE_ACTION_NAME,
variables = cc_common.create_compile_variables(
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain_,
preprocessor_defines = defines,
),
),
)
return CxxFlagsInfo(
cc = _add_if_needed(flags.cc, copts),
cxx = _add_if_needed(flags.cxx, cxxopts),
cxx_linker_shared = _add_if_needed(flags.cxx_linker_shared, linkopts),
cxx_linker_static = flags.cxx_linker_static,
cxx_linker_executable = _add_if_needed(flags.cxx_linker_executable, linkopts),
assemble = _add_if_needed(flags.assemble, copts),
)
def _add_if_needed(arr, add_arr):
filtered = []
for to_add in add_arr:
found = False
for existing in arr:
if existing == to_add:
found = True
if not found:
filtered.append(to_add)
return arr + filtered
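Since `_add_if_needed` is plain list logic, its behavior can be checked as ordinary Python (a standalone rendition, not the Bazel-loaded code): flags already present in `arr` are skipped, everything else is appended in order.

```python
def add_if_needed(arr, add_arr):
    # Append only the flags from add_arr that are not already in arr,
    # preserving order (and duplicates within add_arr, as the original does).
    return arr + [flag for flag in add_arr if flag not in arr]

print(add_if_needed(["-O2", "-g"], ["-g", "-Wall"]))  # ['-O2', '-g', '-Wall']
```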
def absolutize_path_in_str(workspace_name, root_str, text, force = False):
"""Replaces relative paths in (the middle of) 'text', prepending them with 'root_str'. If there is nothing to replace, returns 'text' unchanged.
We will only replace relative paths starting with either 'external/' or '<top-package-name>/',
because we only want to point absolute paths at external repositories or inside our
current workspace. (This also limits the possibility of errors from such inexact replacement.)
Args:
workspace_name: workspace name
root_str: the text to prepend to the found relative path
text: the text to do the replacement in
force: if True, 'root_str' will always be prepended to relative paths
Returns:
string: A formatted string
"""
new_text = _prefix(text, "external/", root_str)
if new_text == text:
new_text = _prefix(text, workspace_name + "/", root_str)
# absolutize relative paths by adding our working directory;
# this works because on Windows we run under MSYS now
if force and new_text == text and not text.startswith("/"):
new_text = root_str + "/" + text
return new_text
def _prefix(text, from_str, prefix):
text = text.replace('"', '\\"')
(before, middle, after) = text.partition(from_str)
if not middle or before.endswith("/"):
return text
return before + prefix + middle + after
def _file_name_no_ext(basename):
(before, separator, after) = basename.rpartition(".")
return before
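The two string helpers above are also expressible as ordinary Python; this standalone rendition shows the guard in `_prefix` that leaves already-absolute occurrences (where the match is preceded by `/`) untouched:

```python
def prefix(text, from_str, prefix_str):
    # Prepend prefix_str to the first occurrence of from_str, unless
    # from_str is absent or already preceded by "/" (not a relative path).
    text = text.replace('"', '\\"')
    before, middle, after = text.partition(from_str)
    if not middle or before.endswith("/"):
        return text
    return before + prefix_str + middle + after

def file_name_no_ext(basename):
    # Drop the last extension: "libfoo.pic.a" -> "libfoo.pic".
    before, _, _ = basename.rpartition(".")
    return before

print(prefix("-Iexternal/zlib/include", "external/", "/root/"))  # -I/root/external/zlib/include
print(prefix("-I/abs/external/zlib", "external/", "/root/"))     # unchanged
```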


@ -0,0 +1,312 @@
""" Contains all logic for calling CMake for building external libraries/binaries """
load(":cc_toolchain_util.bzl", "absolutize_path_in_str")
def create_cmake_script(
workspace_name,
cmake_path,
tools,
flags,
install_prefix,
root,
no_toolchain_file,
user_cache,
user_env,
options,
include_dirs = [],
is_debug_mode = True):
"""Constructs CMake script to be passed to cc_external_rule_impl.
Args:
workspace_name: current workspace name
cmake_path: The path to the cmake executable
tools: cc_toolchain tools (CxxToolsInfo)
flags: cc_toolchain flags (CxxFlagsInfo)
install_prefix: value to pass to CMAKE_INSTALL_PREFIX
root: sources root relative to the $EXT_BUILD_ROOT
no_toolchain_file: if False, CMake toolchain file will be generated, otherwise not
user_cache: dictionary with user's values of cache initializers
user_env: dictionary with user's values for CMake environment variables
options: other CMake options specified by user
include_dirs: Optional additional include directories. Defaults to [].
is_debug_mode: If the compilation mode is `debug`. Defaults to True.
Returns:
string: A formatted string of the generated build command
"""
merged_prefix_path = _merge_prefix_path(user_cache, include_dirs)
toolchain_dict = _fill_crossfile_from_toolchain(workspace_name, tools, flags)
params = None
keys_with_empty_values_in_user_cache = [key for key in user_cache if user_cache.get(key) == ""]
if no_toolchain_file:
params = _create_cache_entries_env_vars(toolchain_dict, user_cache, user_env)
else:
params = _create_crosstool_file_text(toolchain_dict, user_cache, user_env)
build_type = params.cache.get(
"CMAKE_BUILD_TYPE",
"Debug" if is_debug_mode else "Release",
)
params.cache.update({
"CMAKE_BUILD_TYPE": build_type,
"CMAKE_INSTALL_PREFIX": install_prefix,
"CMAKE_PREFIX_PATH": merged_prefix_path,
})
# Give user the ability to suppress some value, taken from Bazel's toolchain,
# or to suppress calculated CMAKE_BUILD_TYPE
# If the user passes "CMAKE_BUILD_TYPE": "" (empty string),
# CMAKE_BUILD_TYPE will not be passed to CMake
_wipe_empty_values(params.cache, keys_with_empty_values_in_user_cache)
# However, if no CMAKE_RANLIB was passed, pass the empty value for it explicitly,
# as it is legacy and autodetection of ranlib made by CMake automatically
# breaks some cross compilation builds,
# see https://github.com/envoyproxy/envoy/pull/6991
if not params.cache.get("CMAKE_RANLIB"):
params.cache.update({"CMAKE_RANLIB": ""})
set_env_vars = " ".join([key + "=\"" + params.env[key] + "\"" for key in params.env])
str_cmake_cache_entries = " ".join(["-D" + key + "=\"" + params.cache[key] + "\"" for key in params.cache])
cmake_call = " ".join([
set_env_vars,
cmake_path,
str_cmake_cache_entries,
" ".join(options),
"$EXT_BUILD_ROOT/" + root,
])
return "\n".join(params.commands + [cmake_call])
def _wipe_empty_values(cache, keys_with_empty_values_in_user_cache):
for key in keys_with_empty_values_in_user_cache:
if cache.get(key) != None:
cache.pop(key)
# From CMake documentation: ;-list of directories specifying installation prefixes to be searched...
def _merge_prefix_path(user_cache, include_dirs):
user_prefix = user_cache.get("CMAKE_PREFIX_PATH")
values = ["$EXT_BUILD_DEPS"] + include_dirs
if user_prefix != None:
# remove it; it is going to be merged explicitly
user_cache.pop("CMAKE_PREFIX_PATH")
values.append(user_prefix.strip("\"'"))
return ";".join(values)
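`_merge_prefix_path` is again plain dict logic; a standalone Python rendition makes the merge order visible: `$EXT_BUILD_DEPS` always comes first, then the include dirs, then the (consumed and de-quoted) user value.

```python
def merge_prefix_path(user_cache, include_dirs):
    # Build the ;-separated CMAKE_PREFIX_PATH, popping any user-supplied
    # value from the cache so it is merged rather than passed twice.
    values = ["$EXT_BUILD_DEPS"] + include_dirs
    user_prefix = user_cache.pop("CMAKE_PREFIX_PATH", None)
    if user_prefix is not None:
        values.append(user_prefix.strip("\"'"))
    return ";".join(values)

cache = {"CMAKE_PREFIX_PATH": "'/opt/deps'", "CMAKE_BUILD_TYPE": "Release"}
print(merge_prefix_path(cache, ["include"]))  # $EXT_BUILD_DEPS;include;/opt/deps
```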
_CMAKE_ENV_VARS_FOR_CROSSTOOL = {
"ASMFLAGS": struct(value = "CMAKE_ASM_FLAGS_INIT", replace = False),
"CC": struct(value = "CMAKE_C_COMPILER", replace = True),
"CFLAGS": struct(value = "CMAKE_C_FLAGS_INIT", replace = False),
"CXX": struct(value = "CMAKE_CXX_COMPILER", replace = True),
"CXXFLAGS": struct(value = "CMAKE_CXX_FLAGS_INIT", replace = False),
}
_CMAKE_CACHE_ENTRIES_CROSSTOOL = {
"CMAKE_AR": struct(value = "CMAKE_AR", replace = True),
"CMAKE_ASM_FLAGS": struct(value = "CMAKE_ASM_FLAGS_INIT", replace = False),
"CMAKE_CXX_ARCHIVE_CREATE": struct(value = "CMAKE_CXX_ARCHIVE_CREATE", replace = False),
"CMAKE_CXX_FLAGS": struct(value = "CMAKE_CXX_FLAGS_INIT", replace = False),
"CMAKE_CXX_LINK_EXECUTABLE": struct(value = "CMAKE_CXX_LINK_EXECUTABLE", replace = True),
"CMAKE_C_ARCHIVE_CREATE": struct(value = "CMAKE_C_ARCHIVE_CREATE", replace = False),
"CMAKE_C_FLAGS": struct(value = "CMAKE_C_FLAGS_INIT", replace = False),
"CMAKE_EXE_LINKER_FLAGS": struct(value = "CMAKE_EXE_LINKER_FLAGS_INIT", replace = False),
"CMAKE_RANLIB": struct(value = "CMAKE_RANLIB", replace = True),
"CMAKE_SHARED_LINKER_FLAGS": struct(value = "CMAKE_SHARED_LINKER_FLAGS_INIT", replace = False),
"CMAKE_STATIC_LINKER_FLAGS": struct(value = "CMAKE_STATIC_LINKER_FLAGS_INIT", replace = False),
}
def _create_crosstool_file_text(toolchain_dict, user_cache, user_env):
cache_entries = _dict_copy(user_cache)
env_vars = _dict_copy(user_env)
_move_dict_values(toolchain_dict, env_vars, _CMAKE_ENV_VARS_FOR_CROSSTOOL)
_move_dict_values(toolchain_dict, cache_entries, _CMAKE_CACHE_ENTRIES_CROSSTOOL)
lines = []
for key in toolchain_dict:
if ("CMAKE_AR" == key):
lines.append("set({} \"{}\" {})".format(key, toolchain_dict[key], "CACHE FILEPATH \"Archiver\""))
continue
lines.append("set({} \"{}\")".format(key, toolchain_dict[key]))
cache_entries.update({
"CMAKE_TOOLCHAIN_FILE": "crosstool_bazel.cmake",
})
return struct(
commands = ["cat > crosstool_bazel.cmake <<EOF\n" + "\n".join(sorted(lines)) + "\nEOF\n"],
env = env_vars,
cache = cache_entries,
)
def _dict_copy(d):
out = {}
if d:
out.update(d)
return out
def _create_cache_entries_env_vars(toolchain_dict, user_cache, user_env):
_move_dict_values(toolchain_dict, user_env, _CMAKE_ENV_VARS_FOR_CROSSTOOL)
_move_dict_values(toolchain_dict, user_cache, _CMAKE_CACHE_ENTRIES_CROSSTOOL)
merged_env = _translate_from_toolchain_file(toolchain_dict, _CMAKE_ENV_VARS_FOR_CROSSTOOL)
merged_cache = _translate_from_toolchain_file(toolchain_dict, _CMAKE_CACHE_ENTRIES_CROSSTOOL)
# anything left in user's env_entries does not correspond to anything defined by toolchain
# => simple merge
merged_env.update(user_env)
merged_cache.update(user_cache)
return struct(
commands = [],
env = merged_env,
cache = merged_cache,
)
def _translate_from_toolchain_file(toolchain_dict, descriptor_map):
reverse = _reverse_descriptor_dict(descriptor_map)
cl_keyed_toolchain = dict()
keys = toolchain_dict.keys()
for key in keys:
env_var_key = reverse.get(key)
if env_var_key:
cl_keyed_toolchain[env_var_key.value] = toolchain_dict.pop(key)
return cl_keyed_toolchain
def _merge_toolchain_and_user_values(toolchain_dict, user_dict, descriptor_map):
_move_dict_values(toolchain_dict, user_dict, descriptor_map)
cl_keyed_toolchain = _translate_from_toolchain_file(toolchain_dict, descriptor_map)
# anything left in user's env_entries does not correspond to anything defined by toolchain
# => simple merge
cl_keyed_toolchain.update(user_dict)
return cl_keyed_toolchain
def _reverse_descriptor_dict(dict):
out_dict = {}
for key in dict:
value = dict[key]
out_dict[value.value] = struct(value = key, replace = value.replace)
return out_dict
def _move_dict_values(target, source, descriptor_map):
keys = source.keys()
for key in keys:
existing = descriptor_map.get(key)
if existing:
value = source.pop(key)
if existing.replace or target.get(existing.value) == None:
target[existing.value] = value
else:
target[existing.value] = target[existing.value] + " " + value
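The replace-vs-append semantics of `_move_dict_values` can be sketched as standalone Python: keys named in the descriptor map are moved from `source` into `target` under the mapped name, either replacing the toolchain value (`replace = True`, e.g. `CC`) or appended to it (`replace = False`, e.g. `CFLAGS`).

```python
from collections import namedtuple

Descriptor = namedtuple("Descriptor", ["value", "replace"])

def move_dict_values(target, source, descriptor_map):
    # Move recognized keys from source into target under the mapped name,
    # replacing or appending per the descriptor; unknown keys stay in source.
    for key in list(source.keys()):
        desc = descriptor_map.get(key)
        if desc:
            value = source.pop(key)
            if desc.replace or target.get(desc.value) is None:
                target[desc.value] = value
            else:
                target[desc.value] = target[desc.value] + " " + value

descriptors = {
    "CC": Descriptor("CMAKE_C_COMPILER", True),
    "CFLAGS": Descriptor("CMAKE_C_FLAGS_INIT", False),
}
toolchain = {"CMAKE_C_COMPILER": "gcc", "CMAKE_C_FLAGS_INIT": "-O2"}
user = {"CC": "clang", "CFLAGS": "-Wall", "FOO": "bar"}
move_dict_values(toolchain, user, descriptors)
print(toolchain)  # CMAKE_C_COMPILER replaced, CMAKE_C_FLAGS_INIT appended
print(user)       # only the unrecognized key remains
```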
def _fill_crossfile_from_toolchain(workspace_name, tools, flags):
dict = {}
_sysroot = _find_in_cc_or_cxx(flags, "sysroot")
if _sysroot:
dict["CMAKE_SYSROOT"] = _absolutize(workspace_name, _sysroot)
_ext_toolchain_cc = _find_flag_value(flags.cc, "gcc_toolchain")
if _ext_toolchain_cc:
dict["CMAKE_C_COMPILER_EXTERNAL_TOOLCHAIN"] = _absolutize(workspace_name, _ext_toolchain_cc)
_ext_toolchain_cxx = _find_flag_value(flags.cxx, "gcc_toolchain")
if _ext_toolchain_cxx:
dict["CMAKE_CXX_COMPILER_EXTERNAL_TOOLCHAIN"] = _absolutize(workspace_name, _ext_toolchain_cxx)
# Force convert tools paths to absolute using $EXT_BUILD_ROOT
if tools.cc:
dict["CMAKE_C_COMPILER"] = _absolutize(workspace_name, tools.cc, True)
if tools.cxx:
dict["CMAKE_CXX_COMPILER"] = _absolutize(workspace_name, tools.cxx, True)
if tools.cxx_linker_static:
dict["CMAKE_AR"] = _absolutize(workspace_name, tools.cxx_linker_static, True)
if tools.cxx_linker_static.endswith("/libtool"):
dict["CMAKE_C_ARCHIVE_CREATE"] = "<CMAKE_AR> %s <OBJECTS>" % \
" ".join(flags.cxx_linker_static)
dict["CMAKE_CXX_ARCHIVE_CREATE"] = "<CMAKE_AR> %s <OBJECTS>" % \
" ".join(flags.cxx_linker_static)
if tools.cxx_linker_executable and tools.cxx_linker_executable != tools.cxx:
normalized_path = _absolutize(workspace_name, tools.cxx_linker_executable)
dict["CMAKE_CXX_LINK_EXECUTABLE"] = " ".join([
normalized_path,
"<FLAGS>",
"<CMAKE_CXX_LINK_FLAGS>",
"<LINK_FLAGS>",
"<OBJECTS>",
"-o <TARGET>",
"<LINK_LIBRARIES>",
])
if flags.cc:
dict["CMAKE_C_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.cc)
if flags.cxx:
dict["CMAKE_CXX_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.cxx)
if flags.assemble:
dict["CMAKE_ASM_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.assemble)
    # TODO: these options are needed, but cause a bug because the flags are put in the wrong order => keep this line
# if flags.cxx_linker_static:
# lines += [_set_list(ctx, "CMAKE_STATIC_LINKER_FLAGS_INIT", flags.cxx_linker_static)]
if flags.cxx_linker_shared:
dict["CMAKE_SHARED_LINKER_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.cxx_linker_shared)
if flags.cxx_linker_executable:
dict["CMAKE_EXE_LINKER_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.cxx_linker_executable)
return dict
def _find_in_cc_or_cxx(flags, flag_name_no_dashes):
_value = _find_flag_value(flags.cxx, flag_name_no_dashes)
if _value:
return _value
return _find_flag_value(flags.cc, flag_name_no_dashes)
def _find_flag_value(list, flag_name_no_dashes):
one_dash = "-" + flag_name_no_dashes.lstrip(" ")
two_dash = "--" + flag_name_no_dashes.lstrip(" ")
check_for_value = False
for value in list:
value = value.lstrip(" ")
if check_for_value:
return value.lstrip(" =")
_tail = _tail_if_starts_with(value, one_dash)
_tail = _tail_if_starts_with(value, two_dash) if _tail == None else _tail
if _tail != None and len(_tail) > 0:
return _tail.lstrip(" =")
if _tail != None:
check_for_value = True
return None
def _tail_if_starts_with(str, start):
if (str.startswith(start)):
return str[len(start):]
return None
def _absolutize(workspace_name, text, force = False):
if text.strip(" ").startswith("C:") or text.strip(" ").startswith("c:"):
return text
return absolutize_path_in_str(workspace_name, "$EXT_BUILD_ROOT/", text, force)
def _join_flags_list(workspace_name, flags):
return " ".join([_absolutize(workspace_name, flag) for flag in flags])
export_for_test = struct(
absolutize = _absolutize,
tail_if_starts_with = _tail_if_starts_with,
find_flag_value = _find_flag_value,
fill_crossfile_from_toolchain = _fill_crossfile_from_toolchain,
move_dict_values = _move_dict_values,
reverse_descriptor_dict = _reverse_descriptor_dict,
merge_toolchain_and_user_values = _merge_toolchain_and_user_values,
CMAKE_ENV_VARS_FOR_CROSSTOOL = _CMAKE_ENV_VARS_FOR_CROSSTOOL,
CMAKE_CACHE_ENTRIES_CROSSTOOL = _CMAKE_CACHE_ENTRIES_CROSSTOOL,
)
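
The `_find_flag_value` helper above accepts both the attached form (`--sysroot=/path`) and the detached form (`--sysroot /path`). The same scanning logic can be sketched in plain Python (the function name here is illustrative, not part of the rules' API):

```python
def find_flag_value(flags, name):
    """Scan a flag list for -name/--name, returning the value from either the
    '=value' attached form or from the following list element."""
    one_dash = "-" + name
    two_dash = "--" + name
    expect_value = False
    for raw in flags:
        value = raw.lstrip(" ")
        if expect_value:
            return value.lstrip(" =")
        for prefix in (one_dash, two_dash):
            if value.startswith(prefix):
                tail = value[len(prefix):]
                if tail:
                    return tail.lstrip(" =")
                # Flag present with no attached value: take the next element.
                expect_value = True
                break
    return None

print(find_flag_value(["-O2", "--sysroot=/opt/sdk"], "sysroot"))  # /opt/sdk
print(find_flag_value(["--sysroot", "/opt/sdk"], "sysroot"))      # /opt/sdk
```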

View File

@ -0,0 +1,238 @@
# buildifier: disable=module-docstring
load(":cc_toolchain_util.bzl", "absolutize_path_in_str")
load(":framework.bzl", "get_foreign_cc_dep")
def _pkgconfig_script(ext_build_dirs):
"""Create a script fragment to configure pkg-config"""
script = []
if ext_build_dirs:
for ext_dir in ext_build_dirs:
script.append("##increment_pkg_config_path## $$EXT_BUILD_DEPS$$/" + ext_dir.basename)
script.append("echo \"PKG_CONFIG_PATH=$${PKG_CONFIG_PATH:-}$$\"")
script.append("##define_absolute_paths## $$EXT_BUILD_DEPS$$ $$EXT_BUILD_DEPS$$")
return script
# buildifier: disable=function-docstring
def create_configure_script(
workspace_name,
target_os,
tools,
flags,
root,
user_options,
user_vars,
is_debug,
configure_command,
deps,
inputs,
configure_in_place,
autoconf,
autoconf_options,
autoconf_env_vars,
autoreconf,
autoreconf_options,
autoreconf_env_vars,
autogen,
autogen_command,
autogen_options,
autogen_env_vars):
env_vars_string = get_env_vars(workspace_name, tools, flags, user_vars, deps, inputs)
ext_build_dirs = inputs.ext_build_dirs
script = _pkgconfig_script(ext_build_dirs)
root_path = "$$EXT_BUILD_ROOT$$/{}".format(root)
configure_path = "{}/{}".format(root_path, configure_command)
if configure_in_place:
script.append("##symlink_contents_to_dir## $$EXT_BUILD_ROOT$$/{} $$BUILD_TMPDIR$$".format(root))
root_path = "$$BUILD_TMPDIR$$"
configure_path = "{}/{}".format(root_path, configure_command)
if autogen and configure_in_place:
# NOCONFIGURE is pseudo standard and tells the script to not invoke configure.
# We explicitly invoke configure later.
autogen_env_vars = _get_autogen_env_vars(autogen_env_vars)
script.append("{} \"{}/{}\" {}".format(
" ".join(["{}=\"{}\"".format(key, autogen_env_vars[key]) for key in autogen_env_vars]),
root_path,
autogen_command,
" ".join(autogen_options),
).lstrip())
if autoconf and configure_in_place:
script.append("{} autoconf {}".format(
" ".join(["{}=\"{}\"".format(key, autoconf_env_vars[key]) for key in autoconf_env_vars]),
" ".join(autoconf_options),
).lstrip())
if autoreconf and configure_in_place:
script.append("{} autoreconf {}".format(
" ".join(["{}=\"{}\"".format(key, autoreconf_env_vars[key]) for key in autoreconf_env_vars]),
" ".join(autoreconf_options),
).lstrip())
script.append("{env_vars} \"{configure}\" --prefix=$$BUILD_TMPDIR$$/$$INSTALL_PREFIX$$ {user_options}".format(
env_vars = env_vars_string,
configure = configure_path,
user_options = " ".join(user_options),
))
return "\n".join(script)
# buildifier: disable=function-docstring
def create_make_script(
workspace_name,
tools,
flags,
root,
user_vars,
deps,
inputs,
make_commands,
prefix):
env_vars_string = get_env_vars(workspace_name, tools, flags, user_vars, deps, inputs)
ext_build_dirs = inputs.ext_build_dirs
script = _pkgconfig_script(ext_build_dirs)
script.append("##symlink_contents_to_dir## $$EXT_BUILD_ROOT$$/{} $$BUILD_TMPDIR$$".format(root))
script.extend(make_commands)
return "\n".join(script)
# buildifier: disable=function-docstring
def get_env_vars(
workspace_name,
tools,
flags,
user_vars,
deps,
inputs):
vars = _get_configure_variables(tools, flags, user_vars)
deps_flags = _define_deps_flags(deps, inputs)
if "LDFLAGS" in vars.keys():
vars["LDFLAGS"] = vars["LDFLAGS"] + deps_flags.libs
else:
vars["LDFLAGS"] = deps_flags.libs
# -I flags should be put into preprocessor flags, CPPFLAGS
# https://www.gnu.org/software/autoconf/manual/autoconf-2.63/html_node/Preset-Output-Variables.html
vars["CPPFLAGS"] = deps_flags.flags
return " ".join(["{}=\"{}\""
.format(key, _join_flags_list(workspace_name, vars[key])) for key in vars])
def _get_autogen_env_vars(autogen_env_vars):
# Make a copy if necessary so we can set NOCONFIGURE.
if autogen_env_vars.get("NOCONFIGURE"):
return autogen_env_vars
vars = {}
for key in autogen_env_vars:
vars[key] = autogen_env_vars.get(key)
vars["NOCONFIGURE"] = "1"
return vars
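
`_get_autogen_env_vars` only copies the user's dict when it actually has to add `NOCONFIGURE`, leaving the original untouched. The equivalent behavior in plain Python (the function name is illustrative):

```python
def with_noconfigure(env):
    """Return env with NOCONFIGURE=1 set, without mutating the input dict."""
    # Leave the caller's dict untouched if NOCONFIGURE is already set.
    if env.get("NOCONFIGURE"):
        return env
    merged = dict(env)           # shallow copy so the input is not mutated
    merged["NOCONFIGURE"] = "1"  # tell autogen.sh not to invoke ./configure
    return merged

print(with_noconfigure({"CC": "clang"}))
# {'CC': 'clang', 'NOCONFIGURE': '1'}
```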
def _define_deps_flags(deps, inputs):
# It is very important to keep the order for the linker => put them into list
lib_dirs = []
# Here go libraries built with Bazel
gen_dirs_set = {}
for lib in inputs.libs:
dir_ = lib.dirname
if not gen_dirs_set.get(dir_):
gen_dirs_set[dir_] = 1
lib_dirs.append("-L$$EXT_BUILD_ROOT$$/" + dir_)
include_dirs_set = {}
for include_dir in inputs.include_dirs:
include_dirs_set[include_dir] = "-I$$EXT_BUILD_ROOT$$/" + include_dir
for header in inputs.headers:
include_dir = header.dirname
if not include_dirs_set.get(include_dir):
include_dirs_set[include_dir] = "-I$$EXT_BUILD_ROOT$$/" + include_dir
include_dirs = include_dirs_set.values()
# For the external libraries, we need to refer to the places where
# we copied the dependencies ($EXT_BUILD_DEPS/<lib_name>), because
    # we also want configure to find those same files with pkg-config,
    # *-config scripts, or other mechanisms.
# Since we need the names of include and lib directories under
# the $EXT_BUILD_DEPS/<lib_name>, we ask the provider.
gen_dirs_set = {}
for dep in deps:
external_deps = get_foreign_cc_dep(dep)
if external_deps:
for artifact in external_deps.artifacts.to_list():
if not gen_dirs_set.get(artifact.gen_dir):
gen_dirs_set[artifact.gen_dir] = 1
dir_name = artifact.gen_dir.basename
include_dirs.append("-I$$EXT_BUILD_DEPS$$/{}/{}".format(dir_name, artifact.include_dir_name))
lib_dirs.append("-L$$EXT_BUILD_DEPS$$/{}/{}".format(dir_name, artifact.lib_dir_name))
return struct(
libs = lib_dirs,
flags = include_dirs,
)
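
Because linker search order matters, `_define_deps_flags` dedupes directories with a dict while appending flags to a list, so each `-L` appears once in first-seen order. A small Python sketch of that pattern (paths are made up for illustration):

```python
def dedupe_lib_dirs(lib_files):
    """Collect one -L flag per unique directory, preserving first-seen order."""
    seen = {}
    flags = []
    for path in lib_files:
        dirname = path.rsplit("/", 1)[0]
        if dirname not in seen:
            seen[dirname] = True
            flags.append("-L$EXT_BUILD_ROOT/" + dirname)
    return flags

print(dedupe_lib_dirs(["a/lib/libz.a", "a/lib/libpng.a", "b/lib/libfoo.a"]))
# ['-L$EXT_BUILD_ROOT/a/lib', '-L$EXT_BUILD_ROOT/b/lib']
```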
# See https://www.gnu.org/software/make/manual/html_node/Implicit-Variables.html
_CONFIGURE_FLAGS = {
"ARFLAGS": "cxx_linker_static",
"ASFLAGS": "assemble",
"CFLAGS": "cc",
"CXXFLAGS": "cxx",
"LDFLAGS": "cxx_linker_executable",
# missing: cxx_linker_shared
}
_CONFIGURE_TOOLS = {
"AR": "cxx_linker_static",
"CC": "cc",
"CXX": "cxx",
# missing: cxx_linker_executable
}
def _get_configure_variables(tools, flags, user_env_vars):
vars = {}
for flag in _CONFIGURE_FLAGS:
flag_value = getattr(flags, _CONFIGURE_FLAGS[flag])
if flag_value:
vars[flag] = flag_value
# Merge flags lists
for user_var in user_env_vars:
toolchain_val = vars.get(user_var)
if toolchain_val:
vars[user_var] = toolchain_val + [user_env_vars[user_var]]
tools_dict = {}
for tool in _CONFIGURE_TOOLS:
tool_value = getattr(tools, _CONFIGURE_TOOLS[tool])
if tool_value:
tools_dict[tool] = [tool_value]
# Replace tools paths if user passed other values
for user_var in user_env_vars:
toolchain_val = tools_dict.get(user_var)
if toolchain_val:
tools_dict[user_var] = [user_env_vars[user_var]]
vars.update(tools_dict)
# Put all other environment variables, passed by the user
for user_var in user_env_vars:
if not vars.get(user_var):
vars[user_var] = [user_env_vars[user_var]]
return vars
def _absolutize(workspace_name, text):
return absolutize_path_in_str(workspace_name, "$$EXT_BUILD_ROOT$$/", text)
def _join_flags_list(workspace_name, flags):
return " ".join([_absolutize(workspace_name, flag) for flag in flags])

View File

@ -0,0 +1,85 @@
# buildifier: disable=module-docstring
# buildifier: disable=function-docstring-header
def detect_root(source):
"""Detects the path to the topmost directory of the 'source' outputs.
To be used with external build systems to point to the source code/tools directories.
Args:
source (Target): A filegroup of source files
Returns:
string: The relative path to the root source directory
"""
sources = source.files.to_list()
if len(sources) == 0:
return ""
root = None
level = -1
# find topmost directory
for file in sources:
file_level = _get_level(file.path)
        # If there is no level set yet, or the current file sits at a
        # shallower level than what we have logged, update the root
if level == -1 or level > file_level:
root = file
level = file_level
if not root:
fail("No root source or directory was found")
if root.is_source:
return root.dirname
    # Note this code path will never be hit due to a bug in upstream Bazel
# https://github.com/bazelbuild/bazel/issues/12954
# If the root is not a source file, it must be a directory.
# Thus the path is returned
return root.path
def _get_level(path):
"""Determine the number of sub directories `path` is contained in
Args:
path (string): The target path
Returns:
int: The directory depth of `path`
"""
normalized = path
# This for loop ensures there are no double `//` substrings.
# A for loop is used because there's not currently a `while`
# or a better mechanism for guaranteeing all `//` have been
# cleaned up.
for i in range(len(path)):
new_normalized = normalized.replace("//", "/")
if len(new_normalized) == len(normalized):
break
normalized = new_normalized
return normalized.count("/")
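
Starlark has no `while` loop, which is why `_get_level` uses a bounded `for` to collapse repeated `//`. In plain Python the same depth computation is simply:

```python
def get_level(path):
    """Collapse runs of slashes, then count separators = directory depth."""
    while "//" in path:
        path = path.replace("//", "/")
    return path.count("/")

print(get_level("a/b//c/file.txt"))  # 3
```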
# buildifier: disable=function-docstring-header
# buildifier: disable=function-docstring-args
# buildifier: disable=function-docstring-return
def filter_containing_dirs_from_inputs(input_files_list):
    """When directories are also passed in the filegroup with the sources,
    we end up with containing directories in the sources list,
    which is not allowed by Bazel (execroot creation code fails).
The parent directories will be created for us in the execroot anyway,
so we filter them out."""
# This puts directories in front of their children in list
sorted_list = sorted(input_files_list)
contains_map = {}
for input in input_files_list:
# If the immediate parent directory is already in the list, remove it
if contains_map.get(input.dirname):
contains_map.pop(input.dirname)
contains_map[input.path] = input
return contains_map.values()
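
The filter above drops a directory entry as soon as one of its direct children appears. A hedged Python sketch of the same idea, using path strings in place of Bazel `File` objects:

```python
import os

def filter_containing_dirs(paths):
    """Remove entries that are the immediate parent directory of another entry."""
    kept = {}
    for path in paths:
        parent = os.path.dirname(path)
        if parent in kept:
            kept.pop(parent)  # the parent will be created in the execroot anyway
        kept[path] = True
    return list(kept)

print(filter_containing_dirs(["src", "src/main.c", "src/util.c"]))
# ['src/main.c', 'src/util.c']
```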

View File

@ -0,0 +1,872 @@
""" Contains definitions for creation of external C/C++ build rules (for building external libraries
with CMake, configure/make, autotools)
"""
load("@bazel_skylib//lib:collections.bzl", "collections")
load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain")
load("//foreign_cc:providers.bzl", "ForeignCcArtifact", "ForeignCcDeps")
load("//foreign_cc/private:detect_root.bzl", "detect_root", "filter_containing_dirs_from_inputs")
load(
":cc_toolchain_util.bzl",
"LibrariesToLinkInfo",
"create_linking_info",
"get_env_vars",
"targets_windows",
)
load(
":run_shell_file_utils.bzl",
"copy_directory",
"fictive_file_in_genroot",
)
load(
":shell_script_helper.bzl",
"convert_shell_script",
"create_function",
"os_name",
)
# Dict with definitions of the context attributes, that customize cc_external_rule_impl function.
# Many of the attributes have default values.
#
# Typically, the concrete external library rule will use this structure to create the attributes
# description dict. See cmake.bzl as an example.
#
CC_EXTERNAL_RULE_ATTRIBUTES = {
"additional_inputs": attr.label_list(
doc = (
"Optional additional inputs to be declared as needed for the shell script action." +
"Not used by the shell script part in cc_external_rule_impl."
),
mandatory = False,
allow_files = True,
default = [],
),
"additional_tools": attr.label_list(
doc = (
"Optional additional tools needed for the building. " +
"Not used by the shell script part in cc_external_rule_impl."
),
mandatory = False,
allow_files = True,
default = [],
),
"alwayslink": attr.bool(
doc = (
"Optional. if true, link all the object files from the static library, " +
"even if they are not used."
),
mandatory = False,
default = False,
),
"binaries": attr.string_list(
doc = "__deprecated__: Use `out_binaries` instead.",
),
"data": attr.label_list(
doc = "Files needed by this rule at runtime. May list file or rule targets. Generally allows any target.",
mandatory = False,
allow_files = True,
default = [],
),
"defines": attr.string_list(
doc = (
"Optional compilation definitions to be passed to the dependencies of this library. " +
"They are NOT passed to the compiler, you should duplicate them in the configuration options."
),
mandatory = False,
default = [],
),
"deps": attr.label_list(
doc = (
"Optional dependencies to be copied into the directory structure. " +
"Typically those directly required for the external building of the library/binaries. " +
            "(i.e. those that the external build system will be looking for and paths to which are " +
"provided by the calling rule)"
),
mandatory = False,
allow_files = True,
default = [],
),
"env": attr.string_dict(
doc = (
"Environment variables to set during the build. " +
"$(execpath) macros may be used to point at files which are listed as data deps, tools_deps, or additional_tools, " +
"but unlike with other rules, these will be replaced with absolute paths to those files, " +
"because the build does not run in the exec root. " +
"No other macros are supported."
),
),
"headers_only": attr.bool(
doc = "__deprecated__: Use `out_headers_only` instead.",
mandatory = False,
default = False,
),
"interface_libraries": attr.string_list(
doc = "__deprecated__: Use `out_interface_libs` instead.",
mandatory = False,
),
"lib_name": attr.string(
doc = (
"Library name. Defines the name of the install directory and the name of the static library, " +
"if no output files parameters are defined (any of static_libraries, shared_libraries, " +
"interface_libraries, binaries_names) " +
"Optional. If not defined, defaults to the target's name."
),
mandatory = False,
),
"lib_source": attr.label(
doc = (
"Label with source code to build. Typically a filegroup for the source of remote repository. " +
"Mandatory."
),
mandatory = True,
allow_files = True,
),
"linkopts": attr.string_list(
doc = "Optional link options to be passed up to the dependencies of this library",
mandatory = False,
default = [],
),
"make_commands": attr.string_list(
        doc = "Optional make commands, defaults to [\"make\", \"make install\"]",
mandatory = False,
default = ["make", "make install"],
),
"out_bin_dir": attr.string(
doc = "Optional name of the output subdirectory with the binary files, defaults to 'bin'.",
mandatory = False,
default = "bin",
),
"out_binaries": attr.string_list(
doc = "Optional names of the resulting binaries.",
mandatory = False,
),
"out_headers_only": attr.bool(
doc = "Flag variable to indicate that the library produces only headers",
mandatory = False,
default = False,
),
"out_include_dir": attr.string(
doc = "Optional name of the output subdirectory with the header files, defaults to 'include'.",
mandatory = False,
default = "include",
),
"out_interface_libs": attr.string_list(
doc = "Optional names of the resulting interface libraries.",
mandatory = False,
),
"out_lib_dir": attr.string(
doc = "Optional name of the output subdirectory with the library files, defaults to 'lib'.",
mandatory = False,
default = "lib",
),
"out_shared_libs": attr.string_list(
doc = "Optional names of the resulting shared libraries.",
mandatory = False,
),
"out_static_libs": attr.string_list(
doc = (
"Optional names of the resulting static libraries. Note that if `out_headers_only`, `out_static_libs`, " +
"`out_shared_libs`, and `out_binaries` are not set, default `lib_name.a`/`lib_name.lib` static " +
"library is assumed"
),
mandatory = False,
),
"postfix_script": attr.string(
doc = "Optional part of the shell script to be added after the make commands",
mandatory = False,
),
"shared_libraries": attr.string_list(
doc = "__deprecated__: Use `out_shared_libs` instead.",
mandatory = False,
),
"static_libraries": attr.string_list(
doc = "__deprecated__: Use `out_static_libs` instead.",
mandatory = False,
),
"tools_deps": attr.label_list(
doc = (
"Optional tools to be copied into the directory structure. " +
"Similar to deps, those directly required for the external building of the library/binaries."
),
mandatory = False,
allow_files = True,
cfg = "host",
default = [],
),
# we need to declare this attribute to access cc_toolchain
"_cc_toolchain": attr.label(
default = Label("@bazel_tools//tools/cpp:current_cc_toolchain"),
),
}
# A list of common fragments required by rules using this framework
CC_EXTERNAL_RULE_FRAGMENTS = [
"cpp",
]
# buildifier: disable=function-docstring-header
# buildifier: disable=function-docstring-args
# buildifier: disable=function-docstring-return
def create_attrs(attr_struct, configure_name, create_configure_script, **kwargs):
"""Function for adding/modifying context attributes struct (originally from ctx.attr),
provided by user, to be passed to the cc_external_rule_impl function as a struct.
Copies a struct 'attr_struct' values (with attributes from CC_EXTERNAL_RULE_ATTRIBUTES)
to the resulting struct, adding or replacing attributes passed in 'configure_name',
'configure_script', and '**kwargs' parameters.
"""
attrs = {}
for key in CC_EXTERNAL_RULE_ATTRIBUTES:
if not key.startswith("_") and hasattr(attr_struct, key):
attrs[key] = getattr(attr_struct, key)
attrs["configure_name"] = configure_name
attrs["create_configure_script"] = create_configure_script
for arg in kwargs:
attrs[arg] = kwargs[arg]
return struct(**attrs)
# buildifier: disable=name-conventions
ConfigureParameters = provider(
doc = """Parameters of create_configure_script callback function, called by
cc_external_rule_impl function. create_configure_script creates the configuration part
of the script, and allows to reuse the inputs structure, created by the framework.""",
fields = dict(
ctx = "Rule context",
attrs = """Attributes struct, created by create_attrs function above""",
inputs = """InputFiles provider: summarized information on rule inputs, created by framework
function, to be reused in script creator. Contains in particular merged compilation and linking
dependencies.""",
),
)
def cc_external_rule_impl(ctx, attrs):
"""Framework function for performing external C/C++ building.
To be used to build external libraries or/and binaries with CMake, configure/make, autotools etc.,
and use results in Bazel.
    It is possible to use it to build a group of external libraries that depend on each other or on
    a Bazel library, and to pass the necessary tools.
Accepts the actual commands for build configuration/execution in attrs.
Creates and runs a shell script, which:
1. prepares directory structure with sources, dependencies, and tools symlinked into subdirectories
of the execroot directory. Adds tools into PATH.
2. defines the correct absolute paths in tools with the script paths, see 7
3. defines the following environment variables:
EXT_BUILD_ROOT: execroot directory
EXT_BUILD_DEPS: subdirectory of execroot, which contains the following subdirectories:
For cmake_external built dependencies:
symlinked install directories of the dependencies
for Bazel built/imported dependencies:
include - here the include directories are symlinked
lib - here the library files are symlinked
lib/pkgconfig - here the pkgconfig files are symlinked
bin - here the tools are copied
INSTALLDIR: subdirectory of the execroot (named by the lib_name), where the library/binary
will be installed
These variables should be used by the calling rule to refer to the created directory structure.
4. calls 'attrs.create_configure_script'
5. calls 'attrs.make_commands'
6. calls 'attrs.postfix_script'
7. replaces absolute paths in possibly created scripts with a placeholder value
Please see cmake.bzl for example usage.
Args:
ctx: calling rule context
attrs: attributes struct, created by create_attrs function above.
Contains fields from CC_EXTERNAL_RULE_ATTRIBUTES (see descriptions there),
two mandatory fields:
- configure_name: name of the configuration tool, to be used in action mnemonic,
- create_configure_script(ConfigureParameters): function that creates configuration
script, accepts ConfigureParameters
and some other fields provided by the rule, which have been passed to create_attrs.
Returns:
A list of providers
"""
lib_name = attrs.lib_name or ctx.attr.name
inputs = _define_inputs(attrs)
outputs = _define_outputs(ctx, attrs, lib_name)
out_cc_info = _define_out_cc_info(ctx, attrs, inputs, outputs)
cc_env = _correct_path_variable(get_env_vars(ctx))
set_cc_envs = ""
execution_os_name = os_name(ctx)
if execution_os_name != "osx":
set_cc_envs = "\n".join(["export {}=\"{}\"".format(key, cc_env[key]) for key in cc_env])
lib_header = "Bazel external C/C++ Rules. Building library '{}'".format(lib_name)
    # We cannot declare outputs of the action which are in a parent-child relation,
    # so we need to have a (symlinked) copy of the output directory to provide
    # both the C/C++ artifacts - libraries, headers, and binaries,
    # and the install directory as a whole (which is mostly necessary for chained external builds).
#
# We want the install directory output of this rule to have the same name as the library,
# so symlink it under the same name but in a subdirectory
installdir_copy = copy_directory(ctx.actions, "$$INSTALLDIR$$", "copy_{}/{}".format(lib_name, lib_name))
# we need this fictive file in the root to get the path of the root in the script
empty = fictive_file_in_genroot(ctx.actions, ctx.label.name)
data_dependencies = ctx.attr.data + ctx.attr.tools_deps + ctx.attr.additional_tools
define_variables = [
set_cc_envs,
"export EXT_BUILD_ROOT=##pwd##",
"export INSTALLDIR=$$EXT_BUILD_ROOT$$/" + empty.file.dirname + "/" + lib_name,
"export BUILD_TMPDIR=$${INSTALLDIR}$$.build_tmpdir",
"export EXT_BUILD_DEPS=$${INSTALLDIR}$$.ext_build_deps",
] + [
"export {key}={value}".format(
key = key,
# Prepend the exec root to each $(execpath ) lookup because the working directory will not be the exec root.
value = ctx.expand_location(value.replace("$(execpath ", "$$EXT_BUILD_ROOT$$/$(execpath "), data_dependencies),
)
for key, value in getattr(ctx.attr, "env", {}).items()
]
make_commands = []
for line in attrs.make_commands:
if line == "make" or line.startswith("make "):
make_commands.append(line.replace("make", attrs.make_path, 1))
else:
make_commands.append(line)
script_lines = [
"##echo## \"\"",
"##echo## \"{}\"".format(lib_header),
"##echo## \"\"",
"##script_prelude##",
"\n".join(define_variables),
"##path## $$EXT_BUILD_ROOT$$",
"##mkdirs## $$INSTALLDIR$$",
"##mkdirs## $$BUILD_TMPDIR$$",
"##mkdirs## $$EXT_BUILD_DEPS$$",
_print_env(),
"\n".join(_copy_deps_and_tools(inputs)),
"cd $$BUILD_TMPDIR$$",
attrs.create_configure_script(ConfigureParameters(ctx = ctx, attrs = attrs, inputs = inputs)),
"\n".join(make_commands),
attrs.postfix_script or "",
# replace references to the root directory when building ($BUILD_TMPDIR)
# and the root where the dependencies were installed ($EXT_BUILD_DEPS)
# for the results which are in $INSTALLDIR (with placeholder)
"##replace_absolute_paths## $$INSTALLDIR$$ $$BUILD_TMPDIR$$",
"##replace_absolute_paths## $$INSTALLDIR$$ $$EXT_BUILD_DEPS$$",
installdir_copy.script,
empty.script,
"cd $$EXT_BUILD_ROOT$$",
]
script_text = "\n".join([
"#!/usr/bin/env bash",
convert_shell_script(ctx, script_lines),
])
wrapped_outputs = wrap_outputs(ctx, lib_name, attrs.configure_name, script_text)
rule_outputs = outputs.declared_outputs + [installdir_copy.file]
cc_toolchain = find_cpp_toolchain(ctx)
execution_requirements = {"block-network": ""}
if "requires-network" in ctx.attr.tags:
execution_requirements = {"requires-network": ""}
    # We need to create a native batch script on Windows to ensure the wrapper
    # script can be correctly invoked with bash.
wrapper = wrapped_outputs.wrapper_script_file
extra_tools = []
if "win" in execution_os_name:
batch_wrapper = ctx.actions.declare_file(wrapper.short_path + ".bat")
ctx.actions.write(
output = batch_wrapper,
content = "\n".join([
"@ECHO OFF",
"START /b /wait bash {}".format(wrapper.path),
"",
]),
is_executable = True,
)
extra_tools.append(wrapper)
wrapper = batch_wrapper
ctx.actions.run(
mnemonic = "Cc" + attrs.configure_name.capitalize() + "MakeRule",
inputs = depset(inputs.declared_inputs),
outputs = rule_outputs + [
empty.file,
wrapped_outputs.log_file,
],
tools = depset(
[wrapped_outputs.script_file] + extra_tools + ctx.files.data + ctx.files.tools_deps + ctx.files.additional_tools,
transitive = [cc_toolchain.all_files] + [data[DefaultInfo].default_runfiles.files for data in data_dependencies],
),
# TODO: Default to never using the default shell environment to make builds more hermetic. For now, every platform
# but MacOS will take the default PATH passed by Bazel, not that from cc_toolchain.
use_default_shell_env = execution_os_name != "osx",
executable = wrapper,
execution_requirements = execution_requirements,
# this is ignored if use_default_shell_env = True
env = cc_env,
)
# Gather runfiles transitively as per the documentation in:
# https://docs.bazel.build/versions/master/skylark/rules.html#runfiles
runfiles = ctx.runfiles(files = ctx.files.data)
for target in [ctx.attr.lib_source] + ctx.attr.additional_inputs + ctx.attr.deps + ctx.attr.data:
runfiles = runfiles.merge(target[DefaultInfo].default_runfiles)
externally_built = ForeignCcArtifact(
gen_dir = installdir_copy.file,
bin_dir_name = attrs.out_bin_dir,
lib_dir_name = attrs.out_lib_dir,
include_dir_name = attrs.out_include_dir,
)
output_groups = _declare_output_groups(installdir_copy.file, outputs.out_binary_files)
wrapped_files = [
wrapped_outputs.script_file,
wrapped_outputs.log_file,
wrapped_outputs.wrapper_script_file,
]
output_groups[attrs.configure_name + "_logs"] = wrapped_files
return [
DefaultInfo(
files = depset(direct = rule_outputs),
runfiles = runfiles,
),
OutputGroupInfo(**output_groups),
ForeignCcDeps(artifacts = depset(
[externally_built],
transitive = _get_transitive_artifacts(attrs.deps),
)),
CcInfo(
compilation_context = out_cc_info.compilation_context,
linking_context = out_cc_info.linking_context,
),
]
# buildifier: disable=name-conventions
WrappedOutputs = provider(
doc = "Structure for passing the log and scripts file information, and wrapper script text.",
fields = {
"log_file": "Execution log file",
"script_file": "Main script file",
"wrapper_script": "Wrapper script text to execute",
"wrapper_script_file": "Wrapper script file (output for debugging purposes)",
},
)
# buildifier: disable=function-docstring
def wrap_outputs(ctx, lib_name, configure_name, script_text):
build_log_file = ctx.actions.declare_file("{}_logs/{}.log".format(lib_name, configure_name))
wrapper_script_file = ctx.actions.declare_file("{}_scripts/{}_wrapper_script.sh".format(lib_name, configure_name))
build_script_file = ctx.actions.declare_file("{}_scripts/{}_script.sh".format(lib_name, configure_name))
ctx.actions.write(
output = build_script_file,
content = script_text,
is_executable = True,
)
cleanup_on_success_function = create_function(
ctx,
"cleanup_on_success",
"rm -rf $BUILD_TMPDIR $EXT_BUILD_DEPS",
)
cleanup_on_failure_function = create_function(
ctx,
"cleanup_on_failure",
"\n".join([
"##echo## \"rules_foreign_cc: Build failed!\"",
"##echo## \"rules_foreign_cc: Keeping temp build directory $$BUILD_TMPDIR$$ and dependencies directory $$EXT_BUILD_DEPS$$ for debug.\"",
"##echo## \"rules_foreign_cc: Please note that the directories inside a sandbox are still cleaned unless you specify '--sandbox_debug' Bazel command line flag.\"",
"##echo## \"rules_foreign_cc: Printing build logs:\"",
"##echo## \"_____ BEGIN BUILD LOGS _____\"",
"##cat## $$BUILD_LOG$$",
"##echo## \"_____ END BUILD LOGS _____\"",
"##echo## \"rules_foreign_cc: Build wrapper script location: $$BUILD_WRAPPER_SCRIPT$$\"",
"##echo## \"rules_foreign_cc: Build script location: $$BUILD_SCRIPT$$\"",
"##echo## \"rules_foreign_cc: Build log location: $$BUILD_LOG$$\"",
"##echo## \"\"",
]),
)
trap_function = "##cleanup_function## cleanup_on_success cleanup_on_failure"
build_command_lines = [
"##assert_script_errors##",
cleanup_on_success_function,
cleanup_on_failure_function,
        # the trap call is defined inside, in the way the shell function should be called
        # see, for instance, linux_commands.bzl
trap_function,
"export BUILD_WRAPPER_SCRIPT=\"{}\"".format(wrapper_script_file.path),
"export BUILD_SCRIPT=\"{}\"".format(build_script_file.path),
"export BUILD_LOG=\"{}\"".format(build_log_file.path),
# sometimes the log file is not created, we do not want our script to fail because of this
"##touch## $$BUILD_LOG$$",
"##redirect_out_err## $$BUILD_SCRIPT$$ $$BUILD_LOG$$",
]
build_command = "\n".join([
"#!/usr/bin/env bash",
convert_shell_script(ctx, build_command_lines),
"",
])
ctx.actions.write(
output = wrapper_script_file,
content = build_command,
is_executable = True,
)
return WrappedOutputs(
script_file = build_script_file,
log_file = build_log_file,
wrapper_script_file = wrapper_script_file,
wrapper_script = build_command,
)
def _declare_output_groups(installdir, outputs):
dict_ = {}
dict_["gen_dir"] = depset([installdir])
for output in outputs:
dict_[output.basename] = [output]
return dict_
def _get_transitive_artifacts(deps):
artifacts = []
for dep in deps:
foreign_dep = get_foreign_cc_dep(dep)
if foreign_dep:
artifacts.append(foreign_dep.artifacts)
return artifacts
def _print_env():
return "\n".join([
"##echo## \"Environment:______________\"",
"##env##",
"##echo## \"__________________________\"",
])
def _correct_path_variable(env):
value = env.get("PATH", "")
if not value:
return env
    value = value.replace("C:\\", "/c/")
value = value.replace("\\", "/")
value = value.replace(";", ":")
env["PATH"] = "$PATH:" + value
return env
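
`_correct_path_variable` rewrites a Windows-style `PATH` into the POSIX form that MSYS bash expects. The string transformation, sketched in Python (mirroring the code's limitation of only handling an uppercase `C:` drive):

```python
def to_posix_path(value):
    """Convert a Windows PATH like 'C:\\tools;C:\\bin' to '/c/tools:/c/bin'."""
    value = value.replace("C:\\", "/c/")  # drive prefix -> MSYS mount point
    value = value.replace("\\", "/")      # backslashes -> forward slashes
    return value.replace(";", ":")        # Windows separator -> POSIX separator

print(to_posix_path("C:\\tools;C:\\msys64\\usr\\bin"))
# /c/tools:/c/msys64/usr/bin
```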
def _depset(item):
if item == None:
return depset()
return depset([item])
def _list(item):
if item:
return [item]
return []
def _copy_deps_and_tools(files):
lines = []
lines += _symlink_contents_to_dir("lib", files.libs)
lines += _symlink_contents_to_dir("include", files.headers + files.include_dirs)
if files.tools_files:
lines.append("##mkdirs## $$EXT_BUILD_DEPS$$/bin")
for tool in files.tools_files:
lines.append("##symlink_to_dir## $$EXT_BUILD_ROOT$$/{} $$EXT_BUILD_DEPS$$/bin/".format(tool))
for ext_dir in files.ext_build_dirs:
lines.append("##symlink_to_dir## $$EXT_BUILD_ROOT$$/{} $$EXT_BUILD_DEPS$$".format(_file_path(ext_dir)))
lines.append("##children_to_path## $$EXT_BUILD_DEPS$$/bin")
lines.append("##path## $$EXT_BUILD_DEPS$$/bin")
return lines
def _symlink_contents_to_dir(dir_name, files_list):
# It is possible that some duplicate libraries will be passed as inputs
# to cmake_external or configure_make. Filter duplicates out here.
files_list = collections.uniq(files_list)
if len(files_list) == 0:
return []
lines = ["##mkdirs## $$EXT_BUILD_DEPS$$/" + dir_name]
for file in files_list:
path = _file_path(file).strip()
if path:
lines.append("##symlink_contents_to_dir## \
$$EXT_BUILD_ROOT$$/{} $$EXT_BUILD_DEPS$$/{}".format(path, dir_name))
return lines
def _file_path(file):
return file if type(file) == "string" else file.path
_FORBIDDEN_FOR_FILENAME = ["\\", "/", ":", "*", "\"", "<", ">", "|"]
def _check_file_name(var):
if len(var) == 0:
fail("Library name cannot be an empty string.")
for index in range(0, len(var)):
letter = var[index]
if letter in _FORBIDDEN_FOR_FILENAME:
fail("Symbol '%s' is forbidden in library name '%s'." % (letter, var))
# buildifier: disable=name-conventions
_Outputs = provider(
doc = "Provider to keep different kinds of the external build output files and directories",
fields = dict(
out_include_dir = "Directory with header files (relative to install directory)",
out_binary_files = "Binary files, which will be created by the action",
libraries = "Library files, which will be created by the action",
declared_outputs = "All output files and directories of the action",
),
)
def _define_outputs(ctx, attrs, lib_name):
attr_binaries_libs = []
attr_headers_only = attrs.out_headers_only
attr_interface_libs = []
attr_shared_libs = []
attr_static_libs = []
# TODO: Until the deprecated attributes are removed, we must
# create a mutable list so we can ensure they're being included
attr_binaries_libs.extend(getattr(attrs, "out_binaries", []))
attr_interface_libs.extend(getattr(attrs, "out_interface_libs", []))
attr_shared_libs.extend(getattr(attrs, "out_shared_libs", []))
attr_static_libs.extend(getattr(attrs, "out_static_libs", []))
# TODO: These names are deprecated, remove
if getattr(attrs, "binaries", []):
# buildifier: disable=print
print("The `binaries` attr is deprecated in favor of `out_binaries`. Please update the target `{}`".format(ctx.label))
attr_binaries_libs.extend(getattr(attrs, "binaries", []))
if getattr(attrs, "headers_only", False):
# buildifier: disable=print
print("The `headers_only` attr is deprecated in favor of `out_headers_only`. Please update the target `{}`".format(ctx.label))
attr_headers_only = attrs.headers_only
if getattr(attrs, "interface_libraries", []):
# buildifier: disable=print
print("The `interface_libraries` attr is deprecated in favor of `out_interface_libs`. Please update the target `{}`".format(ctx.label))
attr_interface_libs.extend(getattr(attrs, "interface_libraries", []))
if getattr(attrs, "shared_libraries", []):
# buildifier: disable=print
print("The `shared_libraries` attr is deprecated in favor of `out_shared_libs`. Please update the target `{}`".format(ctx.label))
attr_shared_libs.extend(getattr(attrs, "shared_libraries", []))
if getattr(attrs, "static_libraries", []):
# buildifier: disable=print
print("The `static_libraries` attr is deprecated in favor of `out_static_libs`. Please update the target `{}`".format(ctx.label))
attr_static_libs.extend(getattr(attrs, "static_libraries", []))
static_libraries = []
if not attr_headers_only:
if not attr_static_libs and not attr_shared_libs and not attr_binaries_libs and not attr_interface_libs:
static_libraries = [lib_name + (".lib" if targets_windows(ctx, None) else ".a")]
else:
static_libraries = attr_static_libs
_check_file_name(lib_name)
out_include_dir = ctx.actions.declare_directory(lib_name + "/" + attrs.out_include_dir)
out_binary_files = _declare_out(ctx, lib_name, attrs.out_bin_dir, attr_binaries_libs)
libraries = LibrariesToLinkInfo(
static_libraries = _declare_out(ctx, lib_name, attrs.out_lib_dir, static_libraries),
shared_libraries = _declare_out(ctx, lib_name, attrs.out_lib_dir, attr_shared_libs),
interface_libraries = _declare_out(ctx, lib_name, attrs.out_lib_dir, attr_interface_libs),
)
declared_outputs = [out_include_dir] + out_binary_files
declared_outputs += libraries.static_libraries
declared_outputs += libraries.shared_libraries + libraries.interface_libraries
return _Outputs(
out_include_dir = out_include_dir,
out_binary_files = out_binary_files,
libraries = libraries,
declared_outputs = declared_outputs,
)
def _declare_out(ctx, lib_name, dir_, files):
if files:
return [ctx.actions.declare_file("/".join([lib_name, dir_, file])) for file in files]
return []
# buildifier: disable=name-conventions
InputFiles = provider(
doc = (
"Provider to keep different kinds of input files, directories, " +
"and C/C++ compilation and linking info from dependencies"
),
fields = dict(
headers = "Include files built by Bazel. Will be copied into $EXT_BUILD_DEPS/include.",
include_dirs = (
"Include directories built by Bazel. Will be copied " +
"into $EXT_BUILD_DEPS/include."
),
libs = "Library files built by Bazel. Will be copied into $EXT_BUILD_DEPS/lib.",
tools_files = (
"Files and directories with tools needed for configuration/building " +
"to be copied into the bin folder, which is added to the PATH"
),
ext_build_dirs = (
"Directories with libraries built by framework functions. " +
"These directories should be copied into $EXT_BUILD_DEPS/lib-name as is, with all contents."
),
deps_compilation_info = "Merged CcCompilationInfo from deps attribute",
deps_linking_info = "Merged CcLinkingInfo from deps attribute",
declared_inputs = "All files and directories that must be declared as action inputs",
),
)
def _define_inputs(attrs):
cc_infos = []
bazel_headers = []
bazel_system_includes = []
bazel_libs = []
# Libraries built by framework functions: copy their result directories under
# $EXT_BUILD_DEPS/lib-name
ext_build_dirs = []
for dep in attrs.deps:
external_deps = get_foreign_cc_dep(dep)
cc_infos.append(dep[CcInfo])
if external_deps:
ext_build_dirs += [artifact.gen_dir for artifact in external_deps.artifacts.to_list()]
else:
headers_info = _get_headers(dep[CcInfo].compilation_context)
bazel_headers += headers_info.headers
bazel_system_includes += headers_info.include_dirs
bazel_libs += _collect_libs(dep[CcInfo].linking_context)
# Keep the order of the transitive foreign dependencies
# (the order is important for the correct linking),
# but filter out repeating directories
ext_build_dirs = uniq_list_keep_order(ext_build_dirs)
tools_roots = []
tools_files = []
input_files = []
for tool in attrs.tools_deps:
tool_root = detect_root(tool)
tools_roots.append(tool_root)
for file_list in tool.files.to_list():
tools_files += _list(file_list)
for tool in attrs.additional_tools:
for file_list in tool.files.to_list():
tools_files += _list(file_list)
for input in attrs.additional_inputs:
for file_list in input.files.to_list():
input_files += _list(file_list)
# These variables are needed for correct C/C++ providers construction;
# they should contain all libraries and include directories.
cc_info_merged = cc_common.merge_cc_infos(cc_infos = cc_infos)
return InputFiles(
headers = bazel_headers,
include_dirs = bazel_system_includes,
libs = bazel_libs,
tools_files = tools_roots,
deps_compilation_info = cc_info_merged.compilation_context,
deps_linking_info = cc_info_merged.linking_context,
ext_build_dirs = ext_build_dirs,
declared_inputs = filter_containing_dirs_from_inputs(attrs.lib_source.files.to_list()) +
bazel_libs +
tools_files +
input_files +
cc_info_merged.compilation_context.headers.to_list() +
ext_build_dirs,
)
# buildifier: disable=function-docstring
def uniq_list_keep_order(list):
result = []
contains_map = {}
for item in list:
if contains_map.get(item):
continue
contains_map[item] = 1
result.append(item)
return result
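The dedup helper keeps the first occurrence of each item, which preserves the link order of transitive foreign dependencies. The same logic runs unchanged as plain Python:

```python
def uniq_list_keep_order(items):
    # Preserve first-seen order while dropping duplicates.
    seen = {}
    result = []
    for item in items:
        if item in seen:
            continue
        seen[item] = True
        result.append(item)
    return result

print(uniq_list_keep_order(["lib_a", "lib_b", "lib_a", "lib_c", "lib_b"]))
```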
def get_foreign_cc_dep(dep):
return dep[ForeignCcDeps] if ForeignCcDeps in dep else None
# consider optimizing this to avoid iterating over both collections
def _get_headers(compilation_info):
include_dirs = compilation_info.system_includes.to_list() + \
compilation_info.includes.to_list()
# do not use quote includes, currently they do not contain
# library-specific information
include_dirs = collections.uniq(include_dirs)
headers = []
for header in compilation_info.headers.to_list():
path = header.path
included = False
for dir_ in include_dirs:
if path.startswith(dir_):
included = True
break
if not included:
headers.append(header)
return struct(
headers = headers,
include_dirs = include_dirs,
)
def _define_out_cc_info(ctx, attrs, inputs, outputs):
compilation_info = cc_common.create_compilation_context(
headers = depset([outputs.out_include_dir]),
system_includes = depset([outputs.out_include_dir.path]),
includes = depset([]),
quote_includes = depset([]),
defines = depset(attrs.defines),
)
linking_info = create_linking_info(ctx, attrs.linkopts, outputs.libraries)
cc_info = CcInfo(
compilation_context = compilation_info,
linking_context = linking_info,
)
inputs_info = CcInfo(
compilation_context = inputs.deps_compilation_info,
linking_context = inputs.deps_linking_info,
)
return cc_common.merge_cc_infos(cc_infos = [cc_info, inputs_info])
def _extract_libraries(library_to_link):
return [
library_to_link.static_library,
library_to_link.pic_static_library,
library_to_link.dynamic_library,
library_to_link.interface_library,
]
def _collect_libs(cc_linking):
libs = []
for li in cc_linking.linker_inputs.to_list():
for library_to_link in li.libraries:
for library in _extract_libraries(library_to_link):
if library:
libs.append(library)
return collections.uniq(libs)


@ -0,0 +1,48 @@
# buildifier: disable=module-docstring
# buildifier: disable=name-conventions
CreatedByScript = provider(
doc = "Structure to keep declared file or directory and creating script.",
fields = dict(
file = "Declared file or directory",
script = "Script that creates that file or directory",
),
)
def fictive_file_in_genroot(actions, target_name):
"""Creates a fictive file under the build root.
This makes it possible to address the build root in a script and construct paths under it.
Args:
actions (ctx.actions): actions factory
target_name (ctx.label.name): name of the current target
"""
# we need this fictive file in the genroot to get the path of the root in the script
empty = actions.declare_file("empty_{}.txt".format(target_name))
return CreatedByScript(
file = empty,
script = "##touch## $$EXT_BUILD_ROOT$$/" + empty.path,
)
def copy_directory(actions, orig_path, copy_path):
"""Copies the directory at $EXT_BUILD_ROOT/orig_path to $EXT_BUILD_ROOT/copy_path.
I.e. a copy of the directory is created under $EXT_BUILD_ROOT/copy_path.
Args:
actions: actions factory (ctx.actions)
orig_path: path to the original directory, relative to the build root
copy_path: target directory, relative to the build root
"""
dir_copy = actions.declare_directory(copy_path)
return CreatedByScript(
file = dir_copy,
script = "\n".join([
"##mkdirs## $$EXT_BUILD_ROOT$$/" + dir_copy.path,
"##copy_dir_contents_to_dir## {} $$EXT_BUILD_ROOT$$/{}".format(
orig_path,
dir_copy.path,
),
]),
)


@ -0,0 +1,197 @@
"""Contains functions for converting the intermediate multi-platform notation that defines
a shell script into the actual shell script for a concrete platform.
Notation:
1. `export <varname>=<value>`
Define the environment variable with the name <varname> and value <value>.
If the <value> contains a toolchain command call (see 3), the call is replaced with the needed value.
2. `$$<varname>$$`
References the environment variable with the name <varname>,
i.e. this will become $<varname> on Linux/MacOS, and %<varname>% on Windows.
3. `##<funname>## <arg1> ... <argn>`
Finds the Starlark method with the name <funname> for that shell toolchain command
in a toolchain, and calls it, passing <arg1> .. <argn>.
(see ./shell_toolchain/commands.bzl, ./shell_toolchain/impl/linux_commands.bzl etc.)
The arguments are space-separated; if an argument is quoted, the spaces inside the quotes are
ignored.
! Escaping quotes inside a quoted argument is not supported, as it has not been needed so far.
(Quoted arguments are used for paths and never for arbitrary strings.)
The call of a shell toolchain Starlark method is performed through
//foreign_cc/private/shell_toolchain/toolchains:access.bzl; please refer there for the details.
What is important here is that the Starlark method can also add some text (function definitions)
into a "prelude" part of the shell_context.
The resulting script is constructed from the prelude part with function definitions and
the actual translated script part.
Since function definitions can call other functions, we perform the fictive translation
of the function bodies to populate the "prelude" part of the script.
"""
load("//foreign_cc/private/shell_toolchain/toolchains:access.bzl", "call_shell", "create_context")
load("//foreign_cc/private/shell_toolchain/toolchains:commands.bzl", "PLATFORM_COMMANDS")
def os_name(ctx):
return call_shell(create_context(ctx), "os_name")
def create_function(ctx, name, text):
return call_shell(create_context(ctx), "define_function", name, text)
def convert_shell_script(ctx, script):
""" Converts a shell script from the intermediate notation to the actual shell script.
Please see the file header for the notation description.
Args:
ctx: rule context
script: the array of script strings, each string can be of multiple lines
Returns:
the string with the shell script for the current execution platform
"""
return convert_shell_script_by_context(create_context(ctx), script)
# buildifier: disable=function-docstring
def convert_shell_script_by_context(shell_context, script):
# 0. Split merged fragments into individual lines.
new_script = []
for fragment in script:
new_script += fragment.splitlines()
script = new_script
# 1. Call the functions or replace export statements.
script = [do_function_call(line, shell_context) for line in script]
# 2. Make sure function calls are replaced.
# (it is known there is no deep recursion, do it only once)
script = [do_function_call(line, shell_context) for line in script]
# 3. Same for function bodies.
#
# Since we have some function bodies containing calls to other functions,
# we need to replace calls to the new functions and add the text
# of those functions to shell_context.prelude several times,
# and three passes are enough for our toolchain.
# Example of such function: 'symlink_contents_to_dir'.
processed_prelude = {}
for i in range(1, 4):
for key in shell_context.prelude.keys():
text = shell_context.prelude[key]
lines = text.splitlines()
replaced = "\n".join([
do_function_call(line.strip(" "), shell_context)
for line in lines
])
processed_prelude[key] = replaced
for key in processed_prelude.keys():
shell_context.prelude[key] = processed_prelude[key]
script = shell_context.prelude.values() + script
# 4. replace all variable references
script = [replace_var_ref(line, shell_context) for line in script]
result = "\n".join(script)
return result
# buildifier: disable=function-docstring
def replace_var_ref(text, shell_context):
parts = []
current = text
# long enough
for i in range(1, 100):
(before, varname, after) = extract_wrapped(current, "$$")
if not varname:
parts.append(current)
break
parts.append(before)
parts.append(shell_context.shell.use_var(varname))
current = after
return "".join(parts)
# buildifier: disable=function-docstring
def replace_exports(text, shell_context):
text = text.strip(" ")
(varname, separator, value) = text.partition("=")
if not separator:
fail("Wrong export declaration")
(funname, after) = get_function_name(value.strip(" "))
if funname:
value = call_shell(shell_context, funname, *split_arguments(after.strip(" ")))
return call_shell(shell_context, "export_var", varname, value)
# buildifier: disable=function-docstring
def get_function_name(text):
(funname, separator, after) = text.partition(" ")
if funname == "export":
return (funname, after)
(before, funname_extracted, after_extracted) = extract_wrapped(funname, "##", "##")
if funname_extracted and PLATFORM_COMMANDS.get(funname_extracted):
if len(before) > 0 or len(after_extracted) > 0:
fail("Something wrong with the shell command call notation: " + text)
return (funname_extracted, after)
return (None, None)
# buildifier: disable=function-docstring
def extract_wrapped(text, prefix, postfix = None):
postfix = postfix or prefix
(before, separator, after) = text.partition(prefix)
if not separator or not after:
return (text, None, None)
(varname, separator2, after2) = after.partition(postfix)
if not separator2:
fail("Variable or function name is not marked correctly in fragment: {}".format(text))
return (before, varname, after2)
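`extract_wrapped` is the core parser for both `$$var$$` and `##fun##` markers; because it only uses `partition`, the same logic can be exercised as plain Python (the `ValueError` stands in for Starlark's `fail`):

```python
def extract_wrapped(text, prefix, postfix=None):
    """Split 'text' into (before, wrapped-name, after) around the first marker."""
    postfix = postfix or prefix
    before, sep, after = text.partition(prefix)
    if not sep or not after:
        return (text, None, None)
    name, sep2, rest = after.partition(postfix)
    if not sep2:
        raise ValueError("unterminated marker in fragment: " + text)
    return (before, name, rest)

print(extract_wrapped('cd $$BUILD_TMPDIR$$', "$$"))
print(extract_wrapped('##mkdirs## $$X$$', "##", "##"))
```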
# buildifier: disable=function-docstring
def do_function_call(text, shell_context):
(funname, after) = get_function_name(text.strip(" "))
if not funname:
return text
if funname == "export":
return replace_exports(after, shell_context)
arguments = split_arguments(after.strip(" ")) if after else []
return call_shell(shell_context, funname, *arguments)
# buildifier: disable=function-docstring
def split_arguments(text):
parts = []
current = text.strip(" ")
# long enough
for i in range(1, 100):
if not current:
break
# we are ignoring escaped quotes
(before, separator, after) = current.partition("\"")
if not separator:
parts += current.split(" ")
break
(quoted, separator2, after2) = after.partition("\"")
if not separator2:
fail("Incorrect quoting in fragment: {}".format(current))
before = before.strip(" ")
if before:
parts += before.split(" ")
parts.append("\"" + quoted + "\"")
current = after2
return parts
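The quoted-argument splitting can likewise be tried standalone. This Python mirror replaces the bounded `for` loop with `while` (Starlark forbids unbounded loops); escaped quotes are unsupported, just as above:

```python
def split_arguments(text):
    """Split a command fragment on spaces, keeping double-quoted spans intact."""
    parts = []
    current = text.strip(" ")
    while current:
        before, sep, after = current.partition('"')
        if not sep:
            parts += current.split(" ")
            break
        quoted, sep2, rest = after.partition('"')
        if not sep2:
            raise ValueError("Incorrect quoting in fragment: " + current)
        before = before.strip(" ")
        if before:
            parts += before.split(" ")
        parts.append('"' + quoted + '"')
        current = rest
    return parts

print(split_arguments('bar "-D FOO"'))
```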


@ -0,0 +1,7 @@
load("@bazel_skylib//:bzl_library.bzl", "bzl_library")
bzl_library(
name = "bzl_srcs",
srcs = glob(["**/*.bzl"]),
visibility = ["//:__subpackages__"],
)


@ -0,0 +1,88 @@
# buildifier: disable=module-docstring
def _provider_text(symbols):
return """
WRAPPER = provider(
doc = "Wrapper to hold imported methods",
fields = [{}]
)
""".format(", ".join(["\"%s\"" % symbol_ for symbol_ in symbols]))
def _getter_text():
return """
def id_from_file(file_name):
(before, middle, after) = file_name.partition(".")
return before
def get(file_name):
id = id_from_file(file_name)
return WRAPPER(**_MAPPING[id])
"""
def _mapping_text(ids):
data_ = []
for id in ids:
data_.append("{id} = wrapper_{id}".format(id = id))
return "_MAPPING = dict(\n{data}\n)".format(data = ",\n".join(data_))
def _load_and_wrapper_text(id, file_path, symbols):
load_list = ", ".join(["{id}_{symbol} = \"{symbol}\"".format(id = id, symbol = symbol_) for symbol_ in symbols])
load_statement = "load(\":{file}\", {list})".format(file = file_path, list = load_list)
data = ", ".join(["{symbol} = {id}_{symbol}".format(id = id, symbol = symbol_) for symbol_ in symbols])
wrapper_statement = "wrapper_{id} = dict({data})".format(id = id, data = data)
return struct(
load_ = load_statement,
wrapper = wrapper_statement,
)
def id_from_file(file_name):
(before, middle, after) = file_name.partition(".")
return before
def get_file_name(file_label):
(before, separator, after) = file_label.partition(":")
return id_from_file(after)
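These two helpers derive a toolchain id from a file label: everything after the `:` and before the first `.`. A Python mirror, with an illustrative label:

```python
def id_from_file(file_name):
    # Everything before the first '.' is the id.
    return file_name.partition(".")[0]

def get_file_name(file_label):
    # Strip the package part of a Bazel label, then take the id.
    return id_from_file(file_label.partition(":")[2])

print(get_file_name("//foreign_cc/private/shell_toolchain/toolchains/impl:linux_commands.bzl"))
```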
def _copy_file(rctx, src):
src_path = rctx.path(src)
copy_path = src_path.basename
rctx.template(copy_path, src_path)
return copy_path
_BUILD_FILE = """\
exports_files(
[
"toolchain_data_defs.bzl",
],
visibility = ["//visibility:public"],
)
"""
def _generate_overloads(rctx):
symbols = rctx.attr.symbols
ids = []
lines = ["# Generated overload mappings"]
loads = []
wrappers = []
for file_ in rctx.attr.files:
id = id_from_file(file_.name)
ids.append(id)
copy = _copy_file(rctx, file_)
load_and_wrapper = _load_and_wrapper_text(id, copy, symbols)
loads.append(load_and_wrapper.load_)
wrappers.append(load_and_wrapper.wrapper)
lines += loads
lines += wrappers
lines.append(_mapping_text(ids))
lines.append(_provider_text(symbols))
lines.append(_getter_text())
rctx.file("toolchain_data_defs.bzl", "\n".join(lines))
rctx.file("BUILD", _BUILD_FILE)
generate_overloads = repository_rule(
implementation = _generate_overloads,
attrs = {
"files": attr.label_list(),
"symbols": attr.string_list(),
},
)


@ -0,0 +1,20 @@
load("@bazel_skylib//:bzl_library.bzl", "bzl_library")
load(":defs.bzl", "build_part")
toolchain_type(
name = "shell_commands",
visibility = ["//visibility:public"],
)
build_part(
toolchain_type = ":shell_commands",
)
bzl_library(
name = "bzl_srcs",
srcs = glob(["**/*.bzl"]) + [
"@rules_foreign_cc_commands_overloads//:toolchain_data_defs.bzl",
],
visibility = ["//:__subpackages__"],
deps = ["//foreign_cc/private/shell_toolchain/toolchains/impl:bzl_srcs"],
)


@ -0,0 +1,63 @@
# buildifier: disable=module-docstring
load("//foreign_cc/private/shell_toolchain/toolchains:commands.bzl", "PLATFORM_COMMANDS")
load(":function_and_call.bzl", "FunctionAndCall")
_function_and_call_type = type(FunctionAndCall(text = ""))
def create_context(ctx):
return struct(
shell = ctx.toolchains["@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands"].data,
prelude = {},
)
# buildifier: disable=function-docstring-header
# buildifier: disable=function-docstring-args
# buildifier: disable=function-docstring-return
def call_shell(shell_context, method_, *args):
"""Calls the 'method_' shell command from the toolchain.
Checks the number and types of passed arguments.
If the command returns the resulting text wrapped in a FunctionAndCall provider,
the text of the function is put into the 'prelude' dictionary in the 'shell_context',
and only the call of that function is returned.
"""
check_argument_types(method_, args)
func_ = getattr(shell_context.shell, method_)
result = func_(*args)
if type(result) == _function_and_call_type:
# If needed, add function definition to the prelude part of the script
if not shell_context.prelude.get(method_):
define_function = getattr(shell_context.shell, "define_function")
shell_context.prelude[method_] = define_function(method_, result.text)
# use provided method of calling a defined function or use default
if hasattr(result, "call"):
return result.call
return " ".join([method_] + [_wrap_if_needed(str(arg)) for arg in args])
return result
def _quoted(arg):
return arg.startswith("\"") and arg.endswith("\"")
def _wrap_if_needed(arg):
if arg.find(" ") >= 0 and not _quoted(arg):
return "\"" + arg + "\""
return arg
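The two quoting helpers above ensure arguments with spaces survive as single shell words. In plain Python:

```python
def quoted(arg):
    return arg.startswith('"') and arg.endswith('"')

def wrap_if_needed(arg):
    # Quote arguments containing spaces, unless they are already quoted.
    if " " in arg and not quoted(arg):
        return '"' + arg + '"'
    return arg

print(wrap_if_needed("has space"))
```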
# buildifier: disable=function-docstring
def check_argument_types(method_, args_list):
descriptor = PLATFORM_COMMANDS[method_]
args_info = descriptor.arguments
if len(args_list) != len(args_info):
fail("Wrong number ({}) of arguments ({}) in a call to '{}'".format(
len(args_list),
str(args_list),
method_,
))
for idx in range(0, len(args_list)):
if type(args_list[idx]) != args_info[idx].type_:
fail("Wrong argument '{}' type: '{}'".format(args_info[idx].name, type(args_list[idx])))


@ -0,0 +1,194 @@
# buildifier: disable=module-docstring
CommandInfo = provider(
doc = "Passes information about the Starlark function defining a shell fragment.",
fields = {
"arguments": "Arguments of Starlark function",
"doc": "What the shell script fragment, generated by this function, does.",
},
)
ArgumentInfo = provider(
doc = "",
fields = {
"doc": "",
"name": "",
"type_": "",
},
)
PLATFORM_COMMANDS = {
"assert_script_errors": CommandInfo(
arguments = [],
doc = "Script fragment that stops the execution after any error",
),
"cat": CommandInfo(
arguments = [
ArgumentInfo(name = "filepath", type_ = type(""), doc = "Path to the file"),
],
doc = "Output the file contents to stdout",
),
"children_to_path": CommandInfo(
arguments = [
ArgumentInfo(name = "dir_", type_ = type(""), doc = "Directory"),
],
doc = "Put all immediate subdirectories (and symlinks) into PATH",
),
"cleanup_function": CommandInfo(
arguments = [
ArgumentInfo(name = "on_success", type_ = type(""), doc = "Command(s) to be executed on success"),
ArgumentInfo(name = "on_failure", type_ = type(""), doc = "Command(s) to be executed on failure"),
],
doc = "Trap function called after the script is finished",
# doc = "Read the result of the execution of the previous command, execute success or failure callbacks",
),
"copy_dir_contents_to_dir": CommandInfo(
arguments = [
ArgumentInfo(
name = "source",
type_ = type(""),
doc = "Source directory, immediate children of which are copied",
),
ArgumentInfo(name = "target", type_ = type(""), doc = "Target directory"),
],
doc = "Copies contents of the directory to target directory",
),
"define_absolute_paths": CommandInfo(
arguments = [
ArgumentInfo(name = "dir_", type_ = type(""), doc = "Directory where to replace"),
ArgumentInfo(name = "abs_path", type_ = type(""), doc = "Absolute path value"),
],
doc = "Replaces absolute path placeholder inside 'dir_' with a provided value 'abs_path'",
),
"define_function": CommandInfo(
arguments = [
ArgumentInfo(name = "name", type_ = type(""), doc = "Function name"),
ArgumentInfo(name = "text", type_ = type(""), doc = "Function body"),
],
doc = "Defines a function with 'text' as the function body.",
),
"echo": CommandInfo(
arguments = [ArgumentInfo(name = "text", type_ = type(""), doc = "Text to output")],
doc = "Outputs 'text' to stdout",
),
"env": CommandInfo(
arguments = [],
doc = "Print all environment variables",
),
"export_var": CommandInfo(
arguments = [
ArgumentInfo(name = "name", type_ = type(""), doc = "Variable name"),
ArgumentInfo(name = "value", type_ = type(""), doc = "Variable value"),
],
doc = "Defines and exports environment variable.",
),
"if_else": CommandInfo(
doc = "Creates an if-else construct",
arguments = [
ArgumentInfo(name = "condition", type_ = type(""), doc = "Condition text"),
ArgumentInfo(name = "if_text", type_ = type(""), doc = "If block text"),
ArgumentInfo(name = "else_text", type_ = type(""), doc = "Else block text"),
],
),
"increment_pkg_config_path": CommandInfo(
arguments = [
ArgumentInfo(
name = "source",
type_ = type(""),
doc = "Source directory",
),
],
doc = (
"Find subdirectory inside a passed directory with *.pc files and add it " +
"to the PKG_CONFIG_PATH"
),
),
"local_var": CommandInfo(
arguments = [
ArgumentInfo(name = "name", type_ = type(""), doc = "Variable name"),
ArgumentInfo(name = "value", type_ = type(""), doc = "Variable value"),
],
doc = "Defines local shell variable.",
),
"mkdirs": CommandInfo(
arguments = [ArgumentInfo(name = "path", type_ = type(""), doc = "Path to directory")],
doc = "Creates a directory and, if necessary, its parents",
),
"os_name": CommandInfo(
arguments = [],
doc = "Returns OS name",
),
"path": CommandInfo(
arguments = [
ArgumentInfo(name = "expression", type_ = type(""), doc = "Path"),
],
doc = "Adds the passed argument to the beginning of the PATH.",
),
"pwd": CommandInfo(
arguments = [],
doc = "Returns command for getting current directory.",
),
"redirect_out_err": CommandInfo(
arguments = [
ArgumentInfo(name = "from_process", type_ = type(""), doc = "Process to run"),
ArgumentInfo(name = "to_file", type_ = type(""), doc = "File to redirect output to"),
],
doc = "Runs the process and redirects its stdout and stderr into the given file",
),
"replace_absolute_paths": CommandInfo(
arguments = [
ArgumentInfo(name = "dir_", type_ = type(""), doc = "Directory where to replace"),
ArgumentInfo(name = "abs_path", type_ = type(""), doc = "Absolute path value"),
],
doc = "Replaces absolute path 'abs_path' inside 'dir_' with a placeholder value",
),
"replace_in_files": CommandInfo(
arguments = [
ArgumentInfo(name = "dir_", type_ = type(""), doc = "Directory to search recursively"),
ArgumentInfo(name = "from_", type_ = type(""), doc = "String to be replaced"),
ArgumentInfo(name = "to_", type_ = type(""), doc = "Replace target"),
],
doc = "Replaces all occurrences of 'from_' with 'to_' recursively in the directory 'dir_'.",
),
"script_prelude": CommandInfo(
arguments = [],
doc = "Function for setting necessary environment variables for the platform",
),
"symlink_contents_to_dir": CommandInfo(
arguments = [
ArgumentInfo(
name = "source",
type_ = type(""),
doc = "Source directory, immediate children of which are symlinked, or file to be symlinked.",
),
ArgumentInfo(name = "target", type_ = type(""), doc = "Target directory"),
],
doc = (
"Symlink contents of the directory to target directory (create the target directory if needed). " +
"If file is passed, symlink it into the target directory."
),
),
"symlink_to_dir": CommandInfo(
arguments = [
ArgumentInfo(
name = "source",
type_ = type(""),
doc = "Source directory",
),
ArgumentInfo(name = "target", type_ = type(""), doc = "Target directory"),
],
doc = (
"Symlink all files from source directory to target directory (create the target directory if needed). " +
"NB symlinks from the source directory are copied."
),
),
"touch": CommandInfo(
arguments = [ArgumentInfo(name = "path", type_ = type(""), doc = "Path to file")],
doc = "Creates a file",
),
"use_var": CommandInfo(
arguments = [
ArgumentInfo(name = "name", type_ = type(""), doc = "Variable name"),
],
doc = "Expression to address the variable.",
),
}


@ -0,0 +1,39 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc_commands_overloads//:toolchain_data_defs.bzl", "get")
load(
"//foreign_cc/private/shell_toolchain/polymorphism:generate_overloads.bzl",
"get_file_name",
)
load(":toolchain_mappings.bzl", "TOOLCHAIN_MAPPINGS")
def _toolchain_data(ctx):
return platform_common.ToolchainInfo(data = get(ctx.attr.file_name))
toolchain_data = rule(
implementation = _toolchain_data,
attrs = {
"file_name": attr.string(),
},
)
# buildifier: disable=unnamed-macro
def build_part(toolchain_type):
register_mappings(toolchain_type, TOOLCHAIN_MAPPINGS)
# buildifier: disable=unnamed-macro
def register_mappings(toolchain_type, mappings):
for item in mappings:
file_name = get_file_name(item.file)
toolchain_data(
name = file_name + "_data",
file_name = file_name,
visibility = ["//visibility:public"],
)
native.toolchain(
name = file_name,
toolchain_type = toolchain_type,
toolchain = file_name + "_data",
exec_compatible_with = item.exec_compatible_with if hasattr(item, "exec_compatible_with") else [],
target_compatible_with = item.target_compatible_with if hasattr(item, "target_compatible_with") else [],
)


@ -0,0 +1,9 @@
# buildifier: disable=module-docstring
# buildifier: disable=name-conventions
FunctionAndCall = provider(
doc = "Wrapper to pass function definition and (if custom) function call",
fields = {
"call": "How to call defined function, if different from <function-name> <arg1> ...<argn>",
"text": "Function body, without wrapping function <name>() {} fragment.",
},
)


@ -0,0 +1,7 @@
load("@bazel_skylib//:bzl_library.bzl", "bzl_library")
bzl_library(
name = "bzl_srcs",
srcs = glob(["**/*.bzl"]),
visibility = ["//:__subpackages__"],
)


@ -0,0 +1,172 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:function_and_call.bzl", "FunctionAndCall")
_REPLACE_VALUE = "\\${EXT_BUILD_DEPS}"
def os_name():
return "linux"
def pwd():
return "$(pwd)"
def echo(text):
return "echo \"{text}\"".format(text = text)
def export_var(name, value):
return "export {name}={value}".format(name = name, value = value)
def local_var(name, value):
return "local {name}={value}".format(name = name, value = value)
def use_var(name):
return "$" + name
def env():
return "env"
def path(expression):
return "export PATH=\"{expression}:$PATH\"".format(expression = expression)
def touch(path):
return "touch " + path
def mkdirs(path):
return "mkdir -p " + path
def if_else(condition, if_text, else_text):
return """\
if [ {condition} ]; then
{if_text}
else
{else_text}
fi
""".format(condition = condition, if_text = if_text, else_text = else_text)
# buildifier: disable=function-docstring
def define_function(name, text):
lines = []
lines.append("function " + name + "() {")
for line_ in text.splitlines():
lines.append(" " + line_)
lines.append("}")
return "\n".join(lines)
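`define_function` is pure string manipulation, so it behaves identically when run as Python. A minimal sketch (the four-space indent is an assumption about the real implementation's formatting):

```python
def define_function(name, text):
    # Wrap a raw body in "function <name>() { ... }", indenting each line.
    lines = []
    lines.append("function " + name + "() {")
    for line_ in text.splitlines():
        lines.append("    " + line_)
    lines.append("}")
    return "\n".join(lines)

print(define_function("prelude", "set -euo pipefail"))
```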
def replace_in_files(dir, from_, to_):
return FunctionAndCall(
text = """\
if [ -d "$1" ]; then
find -L $1 -type f \\( -name "*.pc" -or -name "*.la" -or -name "*-config" -or -name "*.cmake" \\) -exec sed -i 's@'"$2"'@'"$3"'@g' {} ';'
fi
""",
)
def copy_dir_contents_to_dir(source, target):
return """cp -L -p -r --no-target-directory "{}" "{}" """.format(source, target)
def symlink_contents_to_dir(source, target):
text = """\
local target="$2"
mkdir -p "$target"
if [[ -f "$1" ]]; then
##symlink_to_dir## "$1" "$target"
elif [[ -L "$1" ]]; then
local actual=$(readlink "$1")
##symlink_contents_to_dir## "$actual" "$target"
elif [[ -d "$1" ]]; then
SAVEIFS=$IFS
IFS=$'\n'
local children=($(find -H "$1" -maxdepth 1 -mindepth 1))
IFS=$SAVEIFS
for child in "${children[@]:-}"; do
##symlink_to_dir## "$child" "$target"
done
fi
"""
return FunctionAndCall(text = text)
def symlink_to_dir(source, target):
text = """\
local target="$2"
mkdir -p "$target"
if [[ -f "$1" ]]; then
ln -s -f -t "$target" "$1"
elif [[ -L "$1" ]]; then
local actual=$(readlink "$1")
##symlink_to_dir## "$actual" "$target"
elif [[ -d "$1" ]]; then
SAVEIFS=$IFS
IFS=$'\n'
local children=($(find -H "$1" -maxdepth 1 -mindepth 1))
IFS=$SAVEIFS
local dirname=$(basename "$1")
for child in "${children[@]:-}"; do
if [[ "$dirname" != *.ext_build_deps ]]; then
##symlink_to_dir## "$child" "$target/$dirname"
fi
done
else
echo "Can not copy $1"
fi
"""
return FunctionAndCall(text = text)
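The `##name##` tokens inside these bodies are placeholders that the shell-script helper later rewrites into real function invocations, which is how `symlink_to_dir` can call itself recursively. A toy Python sketch of that expansion (hypothetical `expand_placeholders` helper; the real logic lives in `shell_script_helper.bzl` and substitutes generated wrapper names):

```python
import re

def expand_placeholders(script, known_functions):
    # Replace every ##fn_name## token with a plain function-call name,
    # failing loudly on helpers that were never defined.
    def sub(match):
        name = match.group(1)
        if name not in known_functions:
            raise KeyError("unknown helper: " + name)
        return name
    return re.sub(r"##([a-z_]+)##", sub, script)

print(expand_placeholders('##symlink_to_dir## "$child" "$target"',
                          {"symlink_to_dir"}))
```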
def script_prelude():
return "set -euo pipefail"
def increment_pkg_config_path(source):
text = """\
local children=$(find $1 -mindepth 1 -name '*.pc')
# assume there is only one directory with pkg config
for child in $children; do
export PKG_CONFIG_PATH="$${PKG_CONFIG_PATH:-}$$:$(dirname $child)"
return
done
"""
return FunctionAndCall(text = text)
def cat(filepath):
return "cat \"{}\"".format(filepath)
def redirect_out_err(from_process, to_file):
return from_process + " &> " + to_file
def assert_script_errors():
return "set -e"
def cleanup_function(on_success, on_failure):
text = "\n".join([
"local ecode=$?",
"if [ $ecode -eq 0 ]; then",
on_success,
"else",
on_failure,
"fi",
])
return FunctionAndCall(text = text, call = "trap \"cleanup_function\" EXIT")
def children_to_path(dir_):
text = """\
if [ -d {dir_} ]; then
local tools=$(find $EXT_BUILD_DEPS/bin -maxdepth 1 -mindepth 1)
for tool in $tools;
do
if [[ -d \"$tool\" ]] || [[ -L \"$tool\" ]]; then
export PATH=$PATH:$tool
fi
done
fi""".format(dir_ = dir_)
return FunctionAndCall(text = text)
def define_absolute_paths(dir_, abs_path):
return "##replace_in_files## {dir_} {REPLACE_VALUE} {abs_path}".format(
dir_ = dir_,
REPLACE_VALUE = _REPLACE_VALUE,
abs_path = abs_path,
)
def replace_absolute_paths(dir_, abs_path):
return "##replace_in_files## {dir_} {abs_path} {REPLACE_VALUE}".format(
dir_ = dir_,
REPLACE_VALUE = _REPLACE_VALUE,
abs_path = abs_path,
)
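`define_absolute_paths` and `replace_absolute_paths` emit the same `##replace_in_files##` command with mirrored search/replace argument order, so one undoes the other: the placeholder is swapped in for the absolute path before caching and restored afterwards. Since both are pure string formatting, they can be exercised directly as Python (copied from the helpers above):

```python
_REPLACE_VALUE = "\\${EXT_BUILD_DEPS}"

def define_absolute_paths(dir_, abs_path):
    # placeholder -> absolute path
    return "##replace_in_files## {dir_} {REPLACE_VALUE} {abs_path}".format(
        dir_=dir_, REPLACE_VALUE=_REPLACE_VALUE, abs_path=abs_path)

def replace_absolute_paths(dir_, abs_path):
    # absolute path -> placeholder (arguments mirrored)
    return "##replace_in_files## {dir_} {abs_path} {REPLACE_VALUE}".format(
        dir_=dir_, REPLACE_VALUE=_REPLACE_VALUE, abs_path=abs_path)

print(define_absolute_paths("$INSTALLDIR", "/abs/build"))
print(replace_absolute_paths("$INSTALLDIR", "/abs/build"))
```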

@@ -0,0 +1,172 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:function_and_call.bzl", "FunctionAndCall")
_REPLACE_VALUE = "\\${EXT_BUILD_DEPS}"
def os_name():
return "linux"
def pwd():
return "$(pwd)"
def echo(text):
return "echo \"{text}\"".format(text = text)
def export_var(name, value):
return "export {name}={value}".format(name = name, value = value)
def local_var(name, value):
return "local {name}={value}".format(name = name, value = value)
def use_var(name):
return "$" + name
def env():
return "env"
def path(expression):
return "export PATH=\"{expression}:$PATH\"".format(expression = expression)
def touch(path):
return "touch " + path
def mkdirs(path):
return "mkdir -p " + path
def if_else(condition, if_text, else_text):
return """
if [ {condition} ]; then
{if_text}
else
{else_text}
fi
""".format(condition = condition, if_text = if_text, else_text = else_text)
# buildifier: disable=function-docstring
def define_function(name, text):
lines = []
lines.append("function " + name + "() {")
for line_ in text.splitlines():
lines.append(" " + line_)
lines.append("}")
return "\n".join(lines)
def replace_in_files(dir, from_, to_):
return FunctionAndCall(
text = """\
if [ -d "$1" ]; then
find -L $1 -type f \\( -name "*.pc" -or -name "*.la" -or -name "*-config" -or -name "*.cmake" \\) -exec sed -i 's@'"$2"'@'"$3"'@g' {} ';'
fi
""",
)
def copy_dir_contents_to_dir(source, target):
return """cp -L -p -r --no-target-directory "{}" "{}" """.format(source, target)
def symlink_contents_to_dir(source, target):
text = """\
local target="$2"
mkdir -p "$target"
if [[ -f "$1" ]]; then
##symlink_to_dir## "$1" "$target"
elif [[ -L "$1" ]]; then
local actual=$(readlink "$1")
##symlink_contents_to_dir## "$actual" "$target"
elif [[ -d "$1" ]]; then
SAVEIFS=$IFS
IFS=$'\n'
local children=($(find -H "$1" -maxdepth 1 -mindepth 1))
IFS=$SAVEIFS
for child in "${children[@]:-}"; do
##symlink_to_dir## "$child" "$target"
done
fi
"""
return FunctionAndCall(text = text)
def symlink_to_dir(source, target):
text = """\
local target="$2"
mkdir -p "$target"
if [[ -f "$1" ]]; then
ln -s -f -t "$target" "$1"
elif [[ -L "$1" && ! -d "$1" ]]; then
cp "$1" "$2"
elif [[ -d "$1" ]]; then
SAVEIFS=$IFS
IFS=$'\n'
local children=($(find -H "$1" -maxdepth 1 -mindepth 1))
IFS=$SAVEIFS
local dirname=$(basename "$1")
mkdir -p "$target/$dirname"
for child in "${children[@]:-}"; do
if [[ "$dirname" != *.ext_build_deps ]]; then
##symlink_to_dir## "$child" "$target/$dirname"
fi
done
else
echo "Can not copy $1"
fi
"""
return FunctionAndCall(text = text)
def script_prelude():
return "set -euo pipefail"
def increment_pkg_config_path(source):
text = """\
local children=$(find $1 -mindepth 1 -name '*.pc')
# assume there is only one directory with pkg config
for child in $children; do
export PKG_CONFIG_PATH="$${PKG_CONFIG_PATH:-}$$:$(dirname $child)"
return
done
"""
return FunctionAndCall(text = text)
def cat(filepath):
return "cat \"{}\"".format(filepath)
def redirect_out_err(from_process, to_file):
return from_process + " &> " + to_file
def assert_script_errors():
return "set -e"
def cleanup_function(on_success, on_failure):
text = "\n".join([
"local ecode=$?",
"if [ $ecode -eq 0 ]; then",
on_success,
"else",
on_failure,
"fi",
])
return FunctionAndCall(text = text, call = "trap \"cleanup_function\" EXIT")
def children_to_path(dir_):
text = """\
if [ -d {dir_} ]; then
local tools=$(find $EXT_BUILD_DEPS/bin -maxdepth 1 -mindepth 1)
for tool in $tools;
do
if [[ -d \"$tool\" ]] || [[ -L \"$tool\" ]]; then
export PATH=$PATH:$tool
fi
done
fi""".format(dir_ = dir_)
return FunctionAndCall(text = text)
def define_absolute_paths(dir_, abs_path):
return "##replace_in_files## {dir_} {REPLACE_VALUE} {abs_path}".format(
dir_ = dir_,
REPLACE_VALUE = _REPLACE_VALUE,
abs_path = abs_path,
)
def replace_absolute_paths(dir_, abs_path):
return "##replace_in_files## {dir_} {abs_path} {REPLACE_VALUE}".format(
dir_ = dir_,
REPLACE_VALUE = _REPLACE_VALUE,
abs_path = abs_path,
)

@@ -0,0 +1,198 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:function_and_call.bzl", "FunctionAndCall")
_REPLACE_VALUE = "\\${EXT_BUILD_DEPS}"
def os_name():
return "osx"
def pwd():
return "$(pwd)"
def echo(text):
return "echo \"{text}\"".format(text = text)
def export_var(name, value):
return "export {name}={value}".format(name = name, value = value)
def local_var(name, value):
return "local {name}={value}".format(name = name, value = value)
def use_var(name):
return "$" + name
def env():
return "env"
def path(expression):
return "export PATH=\"{expression}:$PATH\"".format(expression = expression)
def touch(path):
return "touch " + path
def mkdirs(path):
return "mkdir -p " + path
def if_else(condition, if_text, else_text):
return """\
if [ {condition} ]; then
{if_text}
else
{else_text}
fi
""".format(condition = condition, if_text = if_text, else_text = else_text)
# buildifier: disable=function-docstring
def define_function(name, text):
lines = []
lines.append("function " + name + "() {")
for line_ in text.splitlines():
lines.append(" " + line_)
lines.append("}")
return "\n".join(lines)
def replace_in_files(dir, from_, to_):
return FunctionAndCall(
text = """\
if [ -d "$1" ]; then
find -L -f $1 \\( -name "*.pc" -or -name "*.la" -or -name "*-config" -or -name "*.cmake" \\) -exec sed -i -e 's@'"$2"'@'"$3"'@g' {} ';'
fi
""",
)
def copy_dir_contents_to_dir(source, target):
text = """\
SAVEIFS=$IFS
IFS=$'\n'
local children=($(find "$1" -maxdepth 1 -mindepth 1))
IFS=$SAVEIFS
local target="$2"
mkdir -p "${target}"
for child in "${children[@]:-}"; do
if [[ -f "$child" ]]; then
cp -p "$child" "$target"
elif [[ -L "$child" ]]; then
local actual=$(readlink "$child")
if [[ -f "$actual" ]]; then
cp "$actual" "$target"
else
local dirn=$(basename "$actual")
mkdir -p "$target/$dirn"
##copy_dir_contents_to_dir## "$actual" "$target/$dirn"
fi
elif [[ -d "$child" ]]; then
local dirn=$(basename "$child")
mkdir -p "$target/$dirn"
##copy_dir_contents_to_dir## "$child" "$target/$dirn"
fi
done
"""
return FunctionAndCall(text = text)
def symlink_contents_to_dir(source, target):
text = """\
local target="$2"
mkdir -p "$target"
if [[ -f "$1" ]]; then
##symlink_to_dir## "$1" "$target"
elif [[ -L "$1" && ! -d "$1" ]]; then
local actual=$(readlink "$1")
##symlink_contents_to_dir## "$actual" "$target"
elif [[ -d "$1" ]]; then
SAVEIFS=$IFS
IFS=$'\n'
local children=($(find "$1" -maxdepth 1 -mindepth 1))
IFS=$SAVEIFS
for child in "${children[@]:-}"; do
##symlink_to_dir## "$child" "$target"
done
fi
"""
return FunctionAndCall(text = text)
def symlink_to_dir(source, target):
text = """\
local target="$2"
mkdir -p "$target"
if [[ -f "$1" ]]; then
ln -s -f "$1" "$target"
elif [[ -L "$1" && ! -d "$1" ]]; then
cp "$1" "$2"
elif [[ -d "$1" ]]; then
SAVEIFS=$IFS
IFS=$'\n'
local children=($(find "$1" -maxdepth 1 -mindepth 1))
IFS=$SAVEIFS
local dirname=$(basename "$1")
mkdir -p "$target/$dirname"
for child in "${children[@]:-}"; do
if [[ "$dirname" != *.ext_build_deps ]]; then
##symlink_to_dir## "$child" "$target/$dirname"
fi
done
else
echo "Can not copy $1"
fi
"""
return FunctionAndCall(text = text)
def script_prelude():
return "set -euo pipefail"
def increment_pkg_config_path(source):
text = """\
local children=$(find $1 -mindepth 1 -name '*.pc')
# assume there is only one directory with pkg config
for child in $children; do
export PKG_CONFIG_PATH="$${PKG_CONFIG_PATH:-}$$:$(dirname $child)"
return
done
"""
return FunctionAndCall(text = text)
def cat(filepath):
return "cat \"{}\"".format(filepath)
def redirect_out_err(from_process, to_file):
return from_process + " &> " + to_file
def assert_script_errors():
return "set -e"
def cleanup_function(on_success, on_failure):
text = "\n".join([
"local ecode=$?",
"if [ $ecode -eq 0 ]; then",
on_success,
"else",
on_failure,
"fi",
])
return FunctionAndCall(text = text, call = "trap \"cleanup_function\" EXIT")
def children_to_path(dir_):
text = """\
if [ -d {dir_} ]; then
local tools=$(find $EXT_BUILD_DEPS/bin -maxdepth 1 -mindepth 1)
for tool in $tools;
do
if [[ -d \"$tool\" ]] || [[ -L \"$tool\" ]]; then
export PATH=$PATH:$tool
fi
done
fi""".format(dir_ = dir_)
return FunctionAndCall(text = text)
def define_absolute_paths(dir_, abs_path):
return "##replace_in_files## {dir_} {REPLACE_VALUE} {abs_path}".format(
dir_ = dir_,
REPLACE_VALUE = _REPLACE_VALUE,
abs_path = abs_path,
)
def replace_absolute_paths(dir_, abs_path):
return "##replace_in_files## {dir_} {abs_path} {REPLACE_VALUE}".format(
dir_ = dir_,
REPLACE_VALUE = _REPLACE_VALUE,
abs_path = abs_path,
)

@@ -0,0 +1,182 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:function_and_call.bzl", "FunctionAndCall")
_REPLACE_VALUE = "\\${EXT_BUILD_DEPS}"
def os_name():
return "windows"
def pwd():
return "$(type -t cygpath > /dev/null && cygpath $(pwd) -w || pwd -W)"
def echo(text):
return "echo \"{text}\"".format(text = text)
def export_var(name, value):
return "export {name}={value}".format(name = name, value = value)
def local_var(name, value):
return "local {name}={value}".format(name = name, value = value)
def use_var(name):
return "$" + name
def env():
return "env"
def path(expression):
return "export PATH=\"{expression}:$PATH\"".format(expression = expression)
def touch(path):
return "touch " + path
def mkdirs(path):
return "mkdir -p " + path
def if_else(condition, if_text, else_text):
return """\
if [ {condition} ]; then
{if_text}
else
{else_text}
fi
""".format(condition = condition, if_text = if_text, else_text = else_text)
# buildifier: disable=function-docstring
def define_function(name, text):
lines = []
lines.append("function " + name + "() {")
for line_ in text.splitlines():
lines.append(" " + line_)
lines.append("}")
return "\n".join(lines)
def replace_in_files(dir, from_, to_):
return FunctionAndCall(
text = """\
if [ -d "$1" ]; then
$REAL_FIND -L $1 -type f \\( -name "*.pc" -or -name "*.la" -or -name "*-config" -or -name "*.cmake" \\) -exec sed -i 's@'"$2"'@'"$3"'@g' {} ';'
fi
""",
)
def copy_dir_contents_to_dir(source, target):
return """cp -L -p -r --no-target-directory "{}" "{}" """.format(source, target)
def symlink_contents_to_dir(source, target):
text = """\
local target="$2"
mkdir -p "$target"
if [[ -f "$1" ]]; then
##symlink_to_dir## "$1" "$target"
elif [[ -L "$1" ]]; then
local actual=$(readlink "$1")
##symlink_contents_to_dir## "$actual" "$target"
elif [[ -d "$1" ]]; then
SAVEIFS=$IFS
IFS=$'\n'
local children=($($REAL_FIND -H "$1" -maxdepth 1 -mindepth 1))
IFS=$SAVEIFS
for child in "${children[@]}"; do
##symlink_to_dir## "$child" "$target"
done
fi
"""
return FunctionAndCall(text = text)
def symlink_to_dir(source, target):
text = """\
local target="$2"
mkdir -p "$target"
if [[ -f "$1" ]]; then
ln -s -f -t "$target" "$1"
elif [[ -L "$1" ]]; then
local actual=$(readlink "$1")
##symlink_to_dir## "$actual" "$target"
elif [[ -d "$1" ]]; then
SAVEIFS=$IFS
IFS=$'\n'
local children=($($REAL_FIND -H "$1" -maxdepth 1 -mindepth 1))
IFS=$SAVEIFS
local dirname=$(basename "$1")
for child in "${children[@]}"; do
if [[ "$dirname" != *.ext_build_deps ]]; then
##symlink_to_dir## "$child" "$target/$dirname"
fi
done
else
echo "Can not copy $1"
fi
"""
return FunctionAndCall(text = text)
def script_prelude():
return """\
set -euo pipefail
if [ -f /usr/bin/find ]; then
REAL_FIND="/usr/bin/find"
else
REAL_FIND="$(which find)"
fi
export MSYS_NO_PATHCONV=1
export MSYS2_ARG_CONV_EXCL="*"
export SYSTEMDRIVE="C:"
"""
def increment_pkg_config_path(source):
text = """\
local children=$($REAL_FIND $1 -mindepth 1 -name '*.pc')
# assume there is only one directory with pkg config
for child in $children; do
export PKG_CONFIG_PATH="$${PKG_CONFIG_PATH:-}$$:$(dirname $child)"
return
done
"""
return FunctionAndCall(text = text)
def cat(filepath):
return "cat \"{}\"".format(filepath)
def redirect_out_err(from_process, to_file):
return from_process + " &> " + to_file
def assert_script_errors():
return "set -e"
def cleanup_function(on_success, on_failure):
text = "\n".join([
"local ecode=$?",
"if [ $ecode -eq 0 ]; then",
on_success,
"else",
on_failure,
"fi",
])
return FunctionAndCall(text = text, call = "trap \"cleanup_function\" EXIT")
def children_to_path(dir_):
text = """\
if [ -d {dir_} ]; then
local tools=$($REAL_FIND $EXT_BUILD_DEPS/bin -maxdepth 1 -mindepth 1)
for tool in $tools;
do
if [[ -d \"$tool\" ]] || [[ -L \"$tool\" ]]; then
export PATH=$PATH:$tool
fi
done
fi""".format(dir_ = dir_)
return FunctionAndCall(text = text)
def define_absolute_paths(dir_, abs_path):
return "##replace_in_files## {dir_} {REPLACE_VALUE} {abs_path}".format(
dir_ = dir_,
REPLACE_VALUE = _REPLACE_VALUE,
abs_path = abs_path,
)
def replace_absolute_paths(dir_, abs_path):
return "##replace_in_files## {dir_} {abs_path} {REPLACE_VALUE}".format(
dir_ = dir_,
REPLACE_VALUE = _REPLACE_VALUE,
abs_path = abs_path,
)

@@ -0,0 +1,34 @@
# buildifier: disable=module-docstring
# buildifier: disable=name-conventions
ToolchainMapping = provider(
doc = "Mapping of toolchain definition files to platform constraints",
fields = {
"exec_compatible_with": "Compatible execution platform constraints",
"file": "Toolchain definition file",
"target_compatible_with": "Compatible target platform constraints",
},
)
TOOLCHAIN_MAPPINGS = [
ToolchainMapping(
exec_compatible_with = [
"@platforms//os:linux",
],
file = "@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains/impl:linux_commands.bzl",
),
ToolchainMapping(
exec_compatible_with = [
"@platforms//os:windows",
],
file = "@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains/impl:windows_commands.bzl",
),
ToolchainMapping(
exec_compatible_with = [
"@platforms//os:macos",
],
file = "@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains/impl:macos_commands.bzl",
),
ToolchainMapping(
file = "@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains/impl:default_commands.bzl",
),
]
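Each mapping pairs a commands file with execution-platform constraints, and the constraint-free `default_commands.bzl` entry acts as a catch-all. A Python sketch of that resolution order (the `select_commands` helper is hypothetical, illustrating Bazel's first-match semantics over the registration order):

```python
# Simplified view of TOOLCHAIN_MAPPINGS: (commands file, exec constraints).
MAPPINGS = [
    ("linux_commands.bzl", ["@platforms//os:linux"]),
    ("windows_commands.bzl", ["@platforms//os:windows"]),
    ("macos_commands.bzl", ["@platforms//os:macos"]),
    ("default_commands.bzl", []),  # no constraints: matches any platform
]

def select_commands(platform_constraints):
    # First mapping whose constraints are all satisfied wins, mirroring
    # Bazel matching registered toolchains in order.
    for file, constraints in MAPPINGS:
        if all(c in platform_constraints for c in constraints):
            return file
    return None

print(select_commands(["@platforms//os:macos"]))
```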

@@ -0,0 +1,31 @@
# buildifier: disable=module-docstring
load(
"//foreign_cc/private/shell_toolchain/polymorphism:generate_overloads.bzl",
"generate_overloads",
"get_file_name",
)
load("//foreign_cc/private/shell_toolchain/toolchains:commands.bzl", "PLATFORM_COMMANDS")
load(":toolchain_mappings.bzl", "TOOLCHAIN_MAPPINGS")
# buildifier: disable=unnamed-macro
# buildifier: disable=function-docstring
def workspace_part(
additional_toolchain_mappings = [],
additonal_shell_toolchain_package = None):
mappings = additional_toolchain_mappings + TOOLCHAIN_MAPPINGS
generate_overloads(
name = "rules_foreign_cc_commands_overloads",
files = [item.file for item in mappings],
symbols = PLATFORM_COMMANDS.keys(),
)
ordered_toolchains = []
for item in additional_toolchain_mappings:
if not additonal_shell_toolchain_package:
fail("Please specify the package, where the toolchains will be created")
if not additonal_shell_toolchain_package.endswith(":"):
additonal_shell_toolchain_package = additonal_shell_toolchain_package + ":"
ordered_toolchains.append(additonal_shell_toolchain_package + get_file_name(item.file))
prefix = "@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:"
for item in TOOLCHAIN_MAPPINGS:
ordered_toolchains.append(prefix + get_file_name(item.file))
native.register_toolchains(*ordered_toolchains)
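`workspace_part` places any additional mappings before the built-in ones, so user-supplied shell toolchains are registered first and therefore matched first. A Python sketch of the ordering logic (reusing a simplified `get_file_name` as an assumption, and a shortened built-in list for illustration):

```python
def get_file_name(label):
    # Assumption: strip package path and ".bzl" extension from a label.
    return label.split(":")[-1].rsplit(".bzl", 1)[0]

BUILTIN = ["//impl:linux_commands.bzl", "//impl:default_commands.bzl"]

def ordered_toolchains(additional, package=None):
    toolchains = []
    for label in additional:
        if not package:
            raise ValueError("Please specify the package, where the toolchains will be created")
        if not package.endswith(":"):
            package += ":"
        toolchains.append(package + get_file_name(label))
    # Built-in mappings always come after user-supplied ones.
    prefix = "@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:"
    for label in BUILTIN:
        toolchains.append(prefix + get_file_name(label))
    return toolchains

print(ordered_toolchains(["//me:my_commands.bzl"], "@my//pkg"))
```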

foreign_cc/providers.bzl
@@ -0,0 +1,24 @@
""" A module containing all public facing providers """
# buildifier: disable=name-conventions
ForeignCcDeps = provider(
doc = """Provider to pass transitive information about external libraries.""",
fields = {"artifacts": "Depset of ForeignCcArtifact"},
)
# buildifier: disable=name-conventions
ForeignCcArtifact = provider(
doc = """Groups information about the external library install directory,
and relative bin, include and lib directories.
Serves to pass transitive information about externally built artifacts up the dependency chain.
Can not be used as a top-level provider.
Instances of ForeignCcArtifact are incapsulated in a depset ForeignCcDeps#artifacts.""",
fields = {
"bin_dir_name": "Bin directory, relative to install directory",
"gen_dir": "Install directory",
"include_dir_name": "Include directory, relative to install directory",
"lib_dir_name": "Lib directory, relative to install directory",
},
)
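`ForeignCcDeps` carries a depset of `ForeignCcArtifact` values; a consuming rule typically flattens it and joins each `gen_dir` with the relative subdirectory names. A Python sketch with stand-in namedtuples (the real objects are Starlark providers and depsets, not Python classes):

```python
from collections import namedtuple
import os

# Stand-in for the Starlark provider; same field names as above.
ForeignCcArtifact = namedtuple(
    "ForeignCcArtifact",
    ["gen_dir", "bin_dir_name", "include_dir_name", "lib_dir_name"],
)

def include_dirs(artifacts):
    # Join each install directory with its relative include subdirectory.
    return [os.path.join(a.gen_dir, a.include_dir_name) for a in artifacts]

arts = [ForeignCcArtifact("external/pcre", "bin", "include", "lib")]
print(include_dirs(arts))
```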

@@ -1,8 +1,12 @@
""" Unit tests for CMake script creation """
load("@bazel_skylib//lib:unittest.bzl", "asserts", "unittest")
load("//tools/build_defs:cc_toolchain_util.bzl", "CxxFlagsInfo", "CxxToolsInfo")
load("//tools/build_defs:cmake_script.bzl", "create_cmake_script", "export_for_test")
# buildifier: disable=bzl-visibility
load("//foreign_cc/private:cc_toolchain_util.bzl", "CxxFlagsInfo", "CxxToolsInfo")
# buildifier: disable=bzl-visibility
load("//foreign_cc/private:cmake_script.bzl", "create_cmake_script", "export_for_test")
def _absolutize_test(ctx):
env = unittest.begin(ctx)

@@ -1,14 +1,18 @@
""" Unit tests for shell script conversion """
load("@bazel_skylib//lib:unittest.bzl", "asserts", "unittest")
# buildifier: disable=bzl-visibility
load(
"//tools/build_defs:shell_script_helper.bzl",
"//foreign_cc/private:shell_script_helper.bzl",
"convert_shell_script_by_context",
"do_function_call",
"replace_var_ref",
"split_arguments",
)
load("//tools/build_defs/shell_toolchain/toolchains:function_and_call.bzl", "FunctionAndCall")
# buildifier: disable=bzl-visibility
load("//foreign_cc/private/shell_toolchain/toolchains:function_and_call.bzl", "FunctionAndCall")
def _use_var_linux(varname):
return "$" + varname

@@ -1,6 +1,7 @@
"""A helper rule for testing detect_root function."""
load("@rules_foreign_cc//tools/build_defs:detect_root.bzl", "detect_root")
# buildifier: disable=bzl-visibility
load("@rules_foreign_cc//foreign_cc/private:detect_root.bzl", "detect_root")
def _impl(ctx):
detected_root = detect_root(ctx.attr.srcs)

@@ -1,8 +1,6 @@
# buildifier: disable=module-docstring
load(
"@rules_foreign_cc//tools/build_defs:shell_script_helper.bzl",
"convert_shell_script",
)
# buildifier: disable=bzl-visibility
load("//foreign_cc/private:shell_script_helper.bzl", "convert_shell_script")
def _impl(ctx):
text = convert_shell_script(ctx, ctx.attr.script)
@@ -20,6 +18,6 @@ shell_script_helper_test_rule = rule(
"script": attr.string_list(mandatory = True),
},
toolchains = [
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
"@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
],
)

@@ -1,5 +1,6 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//tools/build_defs:cc_toolchain_util.bzl", "get_flags_info")
# buildifier: disable=bzl-visibility
load("@rules_foreign_cc//foreign_cc/private:cc_toolchain_util.bzl", "get_flags_info")
def _impl(ctx):
flags = get_flags_info(ctx)

@@ -1,11 +1,11 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//tools/build_defs:detect_root.bzl", "detect_root", "filter_containing_dirs_from_inputs")
load(
"@rules_foreign_cc//tools/build_defs:shell_script_helper.bzl",
"convert_shell_script",
)
# buildifier: disable=bzl-visibility
load("@rules_foreign_cc//foreign_cc/private:detect_root.bzl", "detect_root", "filter_containing_dirs_from_inputs")
# buildifier: disable=bzl-visibility
load("@rules_foreign_cc//foreign_cc/private:shell_script_helper.bzl", "convert_shell_script")
def _impl(ctx):
def _symlink_contents_to_dir_test_rule_impl(ctx):
out = ctx.actions.declare_file(ctx.attr.out)
dir1 = detect_root(ctx.attr.dir1)
dir2 = detect_root(ctx.attr.dir2)
@@ -30,13 +30,13 @@ def _impl(ctx):
return [DefaultInfo(files = depset([out]))]
symlink_contents_to_dir_test_rule = rule(
implementation = _impl,
implementation = _symlink_contents_to_dir_test_rule_impl,
attrs = {
"dir1": attr.label(allow_files = True),
"dir2": attr.label(allow_files = True),
"out": attr.string(),
},
toolchains = [
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
"@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
],
)

@@ -1,7 +1,9 @@
""" Unit tests for some utility functions """
load("@bazel_skylib//lib:unittest.bzl", "asserts", "unittest")
load("//tools/build_defs:framework.bzl", "uniq_list_keep_order")
# buildifier: disable=bzl-visibility
load("//foreign_cc/private:framework.bzl", "uniq_list_keep_order")
def _uniq_list_keep_order_test(ctx):
env = unittest.begin(ctx)

@@ -1,77 +1,8 @@
load("@bazel_skylib//:bzl_library.bzl", "bzl_library")
alias(
name = "cmake_toolchain",
actual = "//toolchains:cmake_toolchain",
deprecation = "This target has been moved to `@rules_foreign_cc//toolchains/...`",
visibility = ["//visibility:public"],
)
alias(
name = "ninja_toolchain",
actual = "//toolchains:ninja_toolchain",
deprecation = "This target has been moved to `@rules_foreign_cc//toolchains/...`",
visibility = ["//visibility:public"],
)
alias(
name = "make_toolchain",
actual = "//toolchains:make_toolchain",
deprecation = "This target has been moved to `@rules_foreign_cc//toolchains/...`",
visibility = ["//visibility:public"],
)
alias(
name = "built_cmake_toolchain",
actual = "//toolchains:built_cmake_toolchain",
deprecation = "This target has been moved to `@rules_foreign_cc//toolchains/...`",
)
alias(
name = "built_ninja_toolchain",
actual = "//toolchains:built_ninja_toolchain",
deprecation = "This target has been moved to `@rules_foreign_cc//toolchains/...`",
)
alias(
name = "built_make_toolchain",
actual = "//toolchains:built_make_toolchain",
deprecation = "This target has been moved to `@rules_foreign_cc//toolchains/...`",
)
# Preinstalled cmake will always be the default, if toolchain with more exact constraints
# is not defined before; registered from workspace_definitions.bzl#rules_foreign_cc_dependencies
alias(
name = "preinstalled_cmake_toolchain",
actual = "//toolchains:preinstalled_cmake_toolchain",
deprecation = "This target has been moved to `@rules_foreign_cc//toolchains/...`",
)
# Preinstalled ninja will always be the default, if toolchain with more exact constraints
# is not defined before; registered from workspace_definitions.bzl#rules_foreign_cc_dependencies
alias(
name = "preinstalled_ninja_toolchain",
actual = "//toolchains:preinstalled_ninja_toolchain",
deprecation = "This target has been moved to `@rules_foreign_cc//toolchains/...`",
)
# Preinstalled make will always be the default, if toolchain with more exact constraints
# is not defined before; registered from workspace_definitions.bzl#rules_foreign_cc_dependencies
alias(
name = "preinstalled_make_toolchain",
actual = "//toolchains:preinstalled_make_toolchain",
deprecation = "This target has been moved to `@rules_foreign_cc//toolchains/...`",
)
bzl_library(
name = "bzl_srcs",
srcs = glob(["**/*.bzl"]),
visibility = ["//:__subpackages__"],
deps = [
"//tools/build_defs/native_tools:bzl_srcs",
"//tools/build_defs/shell_toolchain/polymorphism:bzl_srcs",
"//tools/build_defs/shell_toolchain/toolchains:bzl_srcs",
"@bazel_skylib//lib:collections",
"@bazel_skylib//lib:versions",
],
deps = [],
)

@@ -1,55 +1,8 @@
""" Rule for building Boost from sources. """
load("//tools/build_defs:detect_root.bzl", "detect_root")
load(
"//tools/build_defs:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
load("//foreign_cc:defs.bzl", _boost_build = "boost_build")
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
def _boost_build(ctx):
attrs = create_attrs(
ctx.attr,
configure_name = "BuildBoost",
create_configure_script = _create_configure_script,
make_commands = ["./b2 install {} --prefix=.".format(" ".join(ctx.attr.user_options))],
)
return cc_external_rule_impl(ctx, attrs)
print_deprecation()
def _create_configure_script(configureParameters):
ctx = configureParameters.ctx
root = detect_root(ctx.attr.lib_source)
return "\n".join([
"cd $INSTALLDIR",
"##copy_dir_contents_to_dir## $$EXT_BUILD_ROOT$$/{}/. .".format(root),
"./bootstrap.sh {}".format(" ".join(ctx.attr.bootstrap_options)),
])
def _attrs():
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
attrs.update({
"bootstrap_options": attr.string_list(
doc = "any additional flags to pass to bootstrap.sh",
mandatory = False,
),
"user_options": attr.string_list(
doc = "any additional flags to pass to b2",
mandatory = False,
),
})
return attrs
boost_build = rule(
doc = "Rule for building Boost. Invokes bootstrap.sh and then b2 install.",
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _boost_build,
toolchains = [
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)
boost_build = _boost_build

View File

@ -1,419 +1,30 @@
""" Defines create_linking_info, which wraps passed libraries into CcLinkingInfo
"""
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""
load("@bazel_skylib//lib:collections.bzl", "collections")
# buildifier: disable=bzl-visibility
load(
"@bazel_tools//tools/build_defs/cc:action_names.bzl",
"ASSEMBLE_ACTION_NAME",
"CPP_COMPILE_ACTION_NAME",
"CPP_LINK_DYNAMIC_LIBRARY_ACTION_NAME",
"CPP_LINK_EXECUTABLE_ACTION_NAME",
"CPP_LINK_STATIC_LIBRARY_ACTION_NAME",
"C_COMPILE_ACTION_NAME",
)
load(
"//foreign_cc/private:cc_toolchain_util.bzl",
_CxxFlagsInfo = "CxxFlagsInfo",
_CxxToolsInfo = "CxxToolsInfo",
_LibrariesToLinkInfo = "LibrariesToLinkInfo",
_absolutize_path_in_str = "absolutize_path_in_str",
_create_linking_info = "create_linking_info",
_get_env_vars = "get_env_vars",
_get_flags_info = "get_flags_info",
_get_tools_info = "get_tools_info",
_is_debug_mode = "is_debug_mode",
_targets_windows = "targets_windows",
)
load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain")
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
LibrariesToLinkInfo = provider(
doc = "Libraries to be wrapped into CcLinkingInfo",
fields = dict(
static_libraries = "Static library files, optional",
shared_libraries = "Shared library files, optional",
interface_libraries = "Interface library files, optional",
),
)
print_deprecation()
CxxToolsInfo = provider(
doc = "Paths to the C/C++ tools, taken from the toolchain",
fields = dict(
cc = "C compiler",
cxx = "C++ compiler",
cxx_linker_static = "C++ linker to link static library",
cxx_linker_executable = "C++ linker to link executable",
),
)
CxxFlagsInfo = provider(
doc = "Flags for the C/C++ tools, taken from the toolchain",
fields = dict(
cc = "C compiler flags",
cxx = "C++ compiler flags",
cxx_linker_shared = "C++ linker flags when linking shared library",
cxx_linker_static = "C++ linker flags when linking static library",
cxx_linker_executable = "C++ linker flags when linking executable",
assemble = "Assemble flags",
),
)
def _to_list(element):
if element == None:
return []
else:
return [element]
def _to_depset(element):
if element == None:
return depset()
return depset(element)
def _configure_features(ctx, cc_toolchain):
return cc_common.configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain,
requested_features = ctx.features,
unsupported_features = ctx.disabled_features,
)
def _create_libraries_to_link(ctx, files):
libs = []
static_map = _files_map(_filter(files.static_libraries or [], _is_position_independent, True))
pic_static_map = _files_map(_filter(files.static_libraries or [], _is_position_independent, False))
shared_map = _files_map(files.shared_libraries or [])
interface_map = _files_map(files.interface_libraries or [])
names = collections.uniq(static_map.keys() + pic_static_map.keys() + shared_map.keys() + interface_map.keys())
cc_toolchain = find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain,
)
for name_ in names:
libs.append(cc_common.create_library_to_link(
actions = ctx.actions,
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain,
static_library = static_map.get(name_),
pic_static_library = pic_static_map.get(name_),
dynamic_library = shared_map.get(name_),
interface_library = interface_map.get(name_),
alwayslink = ctx.attr.alwayslink,
))
return depset(direct = libs)
def _is_position_independent(file):
return file.basename.endswith(".pic.a")
def _filter(list_, predicate, inverse):
result = []
for elem in list_:
check = predicate(elem)
if not inverse and check or inverse and not check:
result.append(elem)
return result
def _files_map(files_list):
by_names_map = {}
for file_ in files_list:
name_ = _file_name_no_ext(file_.basename)
value = by_names_map.get(name_)
if value:
fail("Can not have libraries with the same name in the same category")
by_names_map[name_] = file_
return by_names_map
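The map above keys each library file by its basename without the last extension and rejects duplicate names within a category. A minimal Python sketch of the same logic (function names here drop the leading underscore of the Starlark originals):

```python
def file_name_no_ext(basename):
    # Everything before the last dot; "libfoo.pic.a" maps to "libfoo.pic".
    return basename.rpartition(".")[0]

def files_map(basenames):
    by_name = {}
    for b in basenames:
        name = file_name_no_ext(b)
        if name in by_name:
            raise ValueError("Can not have libraries with the same name in the same category")
        by_name[name] = b
    return by_name

print(files_map(["libfoo.a", "libbar.a"]))  # {'libfoo': 'libfoo.a', 'libbar': 'libbar.a'}
```

Note that PIC static libraries are filtered into a separate map first, so `libfoo.a` and `libfoo.pic.a` never collide in the same category.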
def _defines_from_deps(ctx):
return depset(transitive = [dep[CcInfo].compilation_context.defines for dep in ctx.attr.deps])
def _build_cc_link_params(
ctx,
user_link_flags,
static_libraries,
dynamic_libraries,
runtime_artifacts):
static_shared = None
static_no_shared = None
if static_libraries != None and len(static_libraries) > 0:
static_shared = cc_common.create_cc_link_params(
ctx = ctx,
user_link_flags = user_link_flags,
libraries_to_link = _to_depset(static_libraries),
)
static_no_shared = cc_common.create_cc_link_params(
ctx = ctx,
libraries_to_link = _to_depset(static_libraries),
)
else:
static_shared = cc_common.create_cc_link_params(
ctx = ctx,
user_link_flags = user_link_flags,
libraries_to_link = _to_depset(dynamic_libraries),
dynamic_libraries_for_runtime = _to_depset(runtime_artifacts),
)
static_no_shared = cc_common.create_cc_link_params(
ctx = ctx,
libraries_to_link = _to_depset(dynamic_libraries),
dynamic_libraries_for_runtime = _to_depset(runtime_artifacts),
)
no_static_shared = None
no_static_no_shared = None
if dynamic_libraries != None and len(dynamic_libraries) > 0:
no_static_shared = cc_common.create_cc_link_params(
ctx = ctx,
user_link_flags = user_link_flags,
libraries_to_link = _to_depset(dynamic_libraries),
dynamic_libraries_for_runtime = _to_depset(runtime_artifacts),
)
no_static_no_shared = cc_common.create_cc_link_params(
ctx = ctx,
libraries_to_link = _to_depset(dynamic_libraries),
dynamic_libraries_for_runtime = _to_depset(runtime_artifacts),
)
else:
no_static_shared = cc_common.create_cc_link_params(
ctx = ctx,
user_link_flags = user_link_flags,
libraries_to_link = _to_depset(static_libraries),
)
no_static_no_shared = cc_common.create_cc_link_params(
ctx = ctx,
libraries_to_link = _to_depset(static_libraries),
)
return {
"dynamic_mode_params_for_dynamic_library": no_static_shared,
"dynamic_mode_params_for_executable": no_static_no_shared,
"static_mode_params_for_dynamic_library": static_shared,
"static_mode_params_for_executable": static_no_shared,
}
def targets_windows(ctx, cc_toolchain):
"""Returns true if build is targeting Windows
Args:
ctx: rule context
cc_toolchain: optional - Cc toolchain
"""
toolchain = cc_toolchain if cc_toolchain else find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = toolchain,
)
return cc_common.is_enabled(
feature_configuration = feature_configuration,
feature_name = "targets_windows",
)
def create_linking_info(ctx, user_link_flags, files):
"""Creates CcLinkingInfo for the passed user link options and libraries.
Args:
ctx (ctx): rule context
user_link_flags (list of strings): link options provided by the user
files (LibrariesToLink): provider with the library files
"""
return cc_common.create_linking_context(
linker_inputs = depset(direct = [
cc_common.create_linker_input(
owner = ctx.label,
libraries = _create_libraries_to_link(ctx, files),
user_link_flags = depset(direct = user_link_flags),
),
]),
)
# buildifier: disable=function-docstring
def get_env_vars(ctx):
cc_toolchain = find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain,
)
copts = ctx.attr.copts if hasattr(ctx.attr, "copts") else []
vars = dict()
for action_name in [C_COMPILE_ACTION_NAME, CPP_LINK_STATIC_LIBRARY_ACTION_NAME, CPP_LINK_EXECUTABLE_ACTION_NAME]:
vars.update(cc_common.get_environment_variables(
feature_configuration = feature_configuration,
action_name = action_name,
variables = cc_common.create_compile_variables(
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain,
user_compile_flags = copts,
),
))
return vars
def is_debug_mode(ctx):
# Compilation mode currently defaults to fastbuild. Use that if for some reason the variable is not set
# https://docs.bazel.build/versions/master/command-line-reference.html#flag--compilation_mode
return ctx.var.get("COMPILATION_MODE", "fastbuild") == "dbg"
def get_tools_info(ctx):
"""Takes information about tools paths from cc_toolchain, returns CxxToolsInfo
Args:
ctx: rule context
"""
cc_toolchain = find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain,
)
return CxxToolsInfo(
cc = cc_common.get_tool_for_action(
feature_configuration = feature_configuration,
action_name = C_COMPILE_ACTION_NAME,
),
cxx = cc_common.get_tool_for_action(
feature_configuration = feature_configuration,
action_name = CPP_COMPILE_ACTION_NAME,
),
cxx_linker_static = cc_common.get_tool_for_action(
feature_configuration = feature_configuration,
action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME,
),
cxx_linker_executable = cc_common.get_tool_for_action(
feature_configuration = feature_configuration,
action_name = CPP_LINK_EXECUTABLE_ACTION_NAME,
),
)
def get_flags_info(ctx, link_output_file = None):
"""Takes information about flags from cc_toolchain, returns CxxFlagsInfo
Args:
ctx: rule context
link_output_file: output file to be specified in the link command line flags
Returns:
CxxFlagsInfo: A provider containing Cxx flags
"""
cc_toolchain_ = find_cpp_toolchain(ctx)
feature_configuration = _configure_features(
ctx = ctx,
cc_toolchain = cc_toolchain_,
)
copts = (ctx.fragments.cpp.copts + ctx.fragments.cpp.conlyopts) or []
cxxopts = (ctx.fragments.cpp.copts + ctx.fragments.cpp.cxxopts) or []
linkopts = ctx.fragments.cpp.linkopts or []
defines = _defines_from_deps(ctx)
flags = CxxFlagsInfo(
cc = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = C_COMPILE_ACTION_NAME,
variables = cc_common.create_compile_variables(
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain_,
preprocessor_defines = defines,
),
),
cxx = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = CPP_COMPILE_ACTION_NAME,
variables = cc_common.create_compile_variables(
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain_,
preprocessor_defines = defines,
add_legacy_cxx_options = True,
),
),
cxx_linker_shared = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = CPP_LINK_DYNAMIC_LIBRARY_ACTION_NAME,
variables = cc_common.create_link_variables(
cc_toolchain = cc_toolchain_,
feature_configuration = feature_configuration,
is_using_linker = True,
is_linking_dynamic_library = True,
),
),
cxx_linker_static = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME,
variables = cc_common.create_link_variables(
cc_toolchain = cc_toolchain_,
feature_configuration = feature_configuration,
is_using_linker = False,
is_linking_dynamic_library = False,
output_file = link_output_file,
),
),
cxx_linker_executable = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = CPP_LINK_EXECUTABLE_ACTION_NAME,
variables = cc_common.create_link_variables(
cc_toolchain = cc_toolchain_,
feature_configuration = feature_configuration,
is_using_linker = True,
is_linking_dynamic_library = False,
),
),
assemble = cc_common.get_memory_inefficient_command_line(
feature_configuration = feature_configuration,
action_name = ASSEMBLE_ACTION_NAME,
variables = cc_common.create_compile_variables(
feature_configuration = feature_configuration,
cc_toolchain = cc_toolchain_,
preprocessor_defines = defines,
),
),
)
return CxxFlagsInfo(
cc = _add_if_needed(flags.cc, copts),
cxx = _add_if_needed(flags.cxx, cxxopts),
cxx_linker_shared = _add_if_needed(flags.cxx_linker_shared, linkopts),
cxx_linker_static = flags.cxx_linker_static,
cxx_linker_executable = _add_if_needed(flags.cxx_linker_executable, linkopts),
assemble = _add_if_needed(flags.assemble, copts),
)
def _add_if_needed(arr, add_arr):
filtered = []
for to_add in add_arr:
found = False
for existing in arr:
if existing == to_add:
found = True
if not found:
filtered.append(to_add)
return arr + filtered
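`_add_if_needed` appends only those flags not already present, preserving order. A minimal Python sketch of the same behavior:

```python
def add_if_needed(arr, add_arr):
    # Append only those entries of add_arr that are not already in arr,
    # keeping the original order of both lists.
    return arr + [flag for flag in add_arr if flag not in arr]

print(add_if_needed(["-O2", "-g"], ["-g", "-Wall"]))  # ['-O2', '-g', '-Wall']
```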
def absolutize_path_in_str(workspace_name, root_str, text, force = False):
"""Replaces relative paths in [the middle of] 'text', prepending them with 'root_str'. If there is nothing to replace, returns the 'text'.
We will only replace relative paths starting with either 'external/' or '<top-package-name>/',
because we only want to point with absolute paths to external repositories or inside our
current workspace (and also to limit the possibility of errors from such inexact replacement).
Args:
workspace_name: workspace name
text: the text to do replacement in
root_str: the text to prepend to the found relative path
force: If true, the `root_str` will always be prepended
Returns:
string: A formatted string
"""
new_text = _prefix(text, "external/", root_str)
if new_text == text:
new_text = _prefix(text, workspace_name + "/", root_str)
# absolutize relative paths by prepending our working directory
# (this works because on Windows we now run under MSYS)
if force and new_text == text and not text.startswith("/"):
new_text = root_str + "/" + text
return new_text
def _prefix(text, from_str, prefix):
text = text.replace('"', '\\"')
(before, middle, after) = text.partition(from_str)
if not middle or before.endswith("/"):
return text
return before + prefix + middle + after
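To see how `absolutize_path_in_str` and `_prefix` interact, here is a self-contained Python sketch of the same logic (the workspace name `my_ws` is a hypothetical example):

```python
def _prefix(text, from_str, prefix):
    text = text.replace('"', '\\"')
    before, middle, after = text.partition(from_str)
    # Only prepend when the match is not already path-qualified
    # (e.g. "some/external/foo" is left alone).
    if not middle or before.endswith("/"):
        return text
    return before + prefix + middle + after

def absolutize_path_in_str(workspace_name, root_str, text, force=False):
    new_text = _prefix(text, "external/", root_str)
    if new_text == text:
        new_text = _prefix(text, workspace_name + "/", root_str)
    if force and new_text == text and not text.startswith("/"):
        new_text = root_str + "/" + text
    return new_text

print(absolutize_path_in_str("my_ws", "$EXT_BUILD_ROOT/", "-Iexternal/zlib/include"))
# -I$EXT_BUILD_ROOT/external/zlib/include
print(absolutize_path_in_str("my_ws", "$EXT_BUILD_ROOT/", "/usr/include"))
# /usr/include  (already absolute, left unchanged)
```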
def _file_name_no_ext(basename):
(before, separator, after) = basename.rpartition(".")
return before
CxxFlagsInfo = _CxxFlagsInfo
CxxToolsInfo = _CxxToolsInfo
LibrariesToLinkInfo = _LibrariesToLinkInfo
absolutize_path_in_str = _absolutize_path_in_str
create_linking_info = _create_linking_info
get_env_vars = _get_env_vars
get_flags_info = _get_flags_info
get_tools_info = _get_tools_info
is_debug_mode = _is_debug_mode
targets_windows = _targets_windows

View File

@ -1,170 +1,13 @@
""" Defines the rule for building external library with CMake
"""
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""
load(
"//toolchains/native_tools:tool_access.bzl",
"get_cmake_data",
"get_make_data",
"get_ninja_data",
)
load(
"//tools/build_defs:cc_toolchain_util.bzl",
"get_flags_info",
"get_tools_info",
"is_debug_mode",
)
load(
"//tools/build_defs:detect_root.bzl",
"detect_root",
)
load(
"//tools/build_defs:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
load(":cmake_script.bzl", "create_cmake_script")
load("//foreign_cc:defs.bzl", _cmake = "cmake")
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
def _cmake_impl(ctx):
cmake_data = get_cmake_data(ctx)
make_data = get_make_data(ctx)
print_deprecation()
tools_deps = ctx.attr.tools_deps + cmake_data.deps + make_data.deps
ninja_data = get_ninja_data(ctx)
make_commands = ctx.attr.make_commands
if _uses_ninja(ctx.attr.make_commands):
tools_deps += ninja_data.deps
make_commands = [command.replace("ninja", ninja_data.path) for command in make_commands]
attrs = create_attrs(
ctx.attr,
configure_name = "CMake",
create_configure_script = _create_configure_script,
postfix_script = "##copy_dir_contents_to_dir## $$BUILD_TMPDIR$$/$$INSTALL_PREFIX$$ $$INSTALLDIR$$\n" + ctx.attr.postfix_script,
tools_deps = tools_deps,
cmake_path = cmake_data.path,
ninja_path = ninja_data.path,
make_path = make_data.path,
make_commands = make_commands,
)
return cc_external_rule_impl(ctx, attrs)
def _uses_ninja(make_commands):
for command in make_commands:
(before, separator, after) = command.partition(" ")
if before == "ninja":
return True
return False
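`_uses_ninja` checks whether any make command starts with the word `ninja`; a minimal Python sketch:

```python
def uses_ninja(make_commands):
    # A command "uses ninja" only when its first whitespace-delimited word
    # is exactly "ninja" (so e.g. "make ninja-targets" does not match).
    for command in make_commands:
        first_word = command.partition(" ")[0]
        if first_word == "ninja":
            return True
    return False

print(uses_ninja(["ninja -j4 install"]))  # True
print(uses_ninja(["make", "make install"]))  # False
```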
def _create_configure_script(configureParameters):
ctx = configureParameters.ctx
inputs = configureParameters.inputs
root = detect_root(ctx.attr.lib_source)
if len(ctx.attr.working_directory) > 0:
root = root + "/" + ctx.attr.working_directory
tools = get_tools_info(ctx)
# CMake will replace <TARGET> with the actual output file
flags = get_flags_info(ctx, "<TARGET>")
no_toolchain_file = ctx.attr.cache_entries.get("CMAKE_TOOLCHAIN_FILE") or not ctx.attr.generate_crosstool_file
define_install_prefix = "export INSTALL_PREFIX=\"" + _get_install_prefix(ctx) + "\"\n"
configure_script = create_cmake_script(
workspace_name = ctx.workspace_name,
cmake_path = configureParameters.attrs.cmake_path,
tools = tools,
flags = flags,
install_prefix = "$$INSTALL_PREFIX$$",
root = root,
no_toolchain_file = no_toolchain_file,
user_cache = dict(ctx.attr.cache_entries),
user_env = dict(ctx.attr.env_vars),
options = ctx.attr.cmake_options,
include_dirs = inputs.include_dirs,
is_debug_mode = is_debug_mode(ctx),
)
return define_install_prefix + configure_script
def _get_install_prefix(ctx):
if ctx.attr.install_prefix:
return ctx.attr.install_prefix
if ctx.attr.lib_name:
return ctx.attr.lib_name
return ctx.attr.name
def _attrs():
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
attrs.update({
"cache_entries": attr.string_dict(
doc = (
"CMake cache entries to initialize (they will be passed with -Dkey=value) " +
"Values, defined by the toolchain, will be joined with the values, passed here. " +
"(Toolchain values come first)"
),
mandatory = False,
default = {},
),
"cmake_options": attr.string_list(
doc = "Other CMake options",
mandatory = False,
default = [],
),
"env_vars": attr.string_dict(
doc = (
"CMake environment variable values to join with toolchain-defined. " +
"For example, additional CXXFLAGS."
),
mandatory = False,
default = {},
),
"generate_crosstool_file": attr.bool(
doc = (
"When True, CMake crosstool file will be generated from the toolchain values, " +
"provided cache-entries and env_vars (some values will still be passed as -Dkey=value " +
"and environment variables). " +
"If CMAKE_TOOLCHAIN_FILE cache entry is passed, specified crosstool file will be used " +
"When using this option to cross-compile, it is required to specify CMAKE_SYSTEM_NAME in the " +
"cache_entries"
),
mandatory = False,
default = True,
),
"install_prefix": attr.string(
doc = "Relative install prefix to be passed to CMake in -DCMAKE_INSTALL_PREFIX",
mandatory = False,
),
"working_directory": attr.string(
doc = (
"Working directory, with the main CMakeLists.txt " +
"(otherwise, the top directory of the lib_source label files is used.)"
),
mandatory = False,
default = "",
),
})
return attrs
cmake = rule(
doc = "Rule for building external library with CMake.",
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _cmake_impl,
toolchains = [
"@rules_foreign_cc//toolchains:cmake_toolchain",
"@rules_foreign_cc//toolchains:ninja_toolchain",
"@rules_foreign_cc//toolchains:make_toolchain",
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)
cmake = _cmake
# This is an alias to the underlying rule and is
# kept around for legacy compatibility. This should
# not be removed without sufficient warning.
cmake_external = cmake
cmake_external = _cmake

View File

@ -1,312 +1,12 @@
""" Contains all logic for calling CMake for building external libraries/binaries """
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""
load(":cc_toolchain_util.bzl", "absolutize_path_in_str")
def create_cmake_script(
workspace_name,
cmake_path,
tools,
flags,
install_prefix,
root,
no_toolchain_file,
user_cache,
user_env,
options,
include_dirs = [],
is_debug_mode = True):
"""Constructs CMake script to be passed to cc_external_rule_impl.
Args:
workspace_name: current workspace name
cmake_path: The path to the cmake executable
tools: cc_toolchain tools (CxxToolsInfo)
flags: cc_toolchain flags (CxxFlagsInfo)
install_prefix: value to pass to CMAKE_INSTALL_PREFIX
root: sources root relative to the $EXT_BUILD_ROOT
no_toolchain_file: if False, CMake toolchain file will be generated, otherwise not
user_cache: dictionary with user's values of cache initializers
user_env: dictionary with user's values for CMake environment variables
options: other CMake options specified by user
include_dirs: Optional additional include directories. Defaults to [].
is_debug_mode: If the compilation mode is `debug`. Defaults to True.
Returns:
string: A formatted string of the generated build command
"""
merged_prefix_path = _merge_prefix_path(user_cache, include_dirs)
toolchain_dict = _fill_crossfile_from_toolchain(workspace_name, tools, flags)
params = None
keys_with_empty_values_in_user_cache = [key for key in user_cache if user_cache.get(key) == ""]
if no_toolchain_file:
params = _create_cache_entries_env_vars(toolchain_dict, user_cache, user_env)
else:
params = _create_crosstool_file_text(toolchain_dict, user_cache, user_env)
build_type = params.cache.get(
"CMAKE_BUILD_TYPE",
"Debug" if is_debug_mode else "Release",
)
params.cache.update({
"CMAKE_BUILD_TYPE": build_type,
"CMAKE_INSTALL_PREFIX": install_prefix,
"CMAKE_PREFIX_PATH": merged_prefix_path,
})
# Give the user the ability to suppress a value taken from Bazel's toolchain,
# or to suppress the computed CMAKE_BUILD_TYPE.
# If the user passes "CMAKE_BUILD_TYPE": "" (an empty string),
# CMAKE_BUILD_TYPE will not be passed to CMake.
wipe_empty_values(params.cache, keys_with_empty_values_in_user_cache)
# However, if no CMAKE_RANLIB was passed, pass the empty value for it explicitly,
# as it is legacy and autodetection of ranlib made by CMake automatically
# breaks some cross compilation builds,
# see https://github.com/envoyproxy/envoy/pull/6991
if not params.cache.get("CMAKE_RANLIB"):
params.cache.update({"CMAKE_RANLIB": ""})
set_env_vars = " ".join([key + "=\"" + params.env[key] + "\"" for key in params.env])
str_cmake_cache_entries = " ".join(["-D" + key + "=\"" + params.cache[key] + "\"" for key in params.cache])
cmake_call = " ".join([
set_env_vars,
cmake_path,
str_cmake_cache_entries,
" ".join(options),
"$EXT_BUILD_ROOT/" + root,
])
return "\n".join(params.commands + [cmake_call])
def wipe_empty_values(cache, keys_with_empty_values_in_user_cache):
for key in keys_with_empty_values_in_user_cache:
if cache.get(key) != None:
cache.pop(key)
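The suppression mechanism can be sketched in Python: a user-supplied empty cache value removes the computed entry entirely, so it is never passed to CMake:

```python
def wipe_empty_values(cache, keys_with_empty_values_in_user_cache):
    # Drop every cache entry the user explicitly set to "" upstream.
    for key in keys_with_empty_values_in_user_cache:
        if cache.get(key) is not None:
            cache.pop(key)

# The user passed CMAKE_BUILD_TYPE = "" to suppress the computed value.
cache = {"CMAKE_BUILD_TYPE": "Release", "CMAKE_INSTALL_PREFIX": "/usr"}
wipe_empty_values(cache, ["CMAKE_BUILD_TYPE"])
print(cache)  # {'CMAKE_INSTALL_PREFIX': '/usr'}
```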
# From CMake documentation: ;-list of directories specifying installation prefixes to be searched...
def _merge_prefix_path(user_cache, include_dirs):
user_prefix = user_cache.get("CMAKE_PREFIX_PATH")
values = ["$EXT_BUILD_DEPS"] + include_dirs
if user_prefix != None:
# remove it; it will be merged separately
user_cache.pop("CMAKE_PREFIX_PATH")
values.append(user_prefix.strip("\"'"))
return ";".join(values)
_CMAKE_ENV_VARS_FOR_CROSSTOOL = {
"ASMFLAGS": struct(value = "CMAKE_ASM_FLAGS_INIT", replace = False),
"CC": struct(value = "CMAKE_C_COMPILER", replace = True),
"CFLAGS": struct(value = "CMAKE_C_FLAGS_INIT", replace = False),
"CXX": struct(value = "CMAKE_CXX_COMPILER", replace = True),
"CXXFLAGS": struct(value = "CMAKE_CXX_FLAGS_INIT", replace = False),
}
_CMAKE_CACHE_ENTRIES_CROSSTOOL = {
"CMAKE_AR": struct(value = "CMAKE_AR", replace = True),
"CMAKE_ASM_FLAGS": struct(value = "CMAKE_ASM_FLAGS_INIT", replace = False),
"CMAKE_CXX_ARCHIVE_CREATE": struct(value = "CMAKE_CXX_ARCHIVE_CREATE", replace = False),
"CMAKE_CXX_FLAGS": struct(value = "CMAKE_CXX_FLAGS_INIT", replace = False),
"CMAKE_CXX_LINK_EXECUTABLE": struct(value = "CMAKE_CXX_LINK_EXECUTABLE", replace = True),
"CMAKE_C_ARCHIVE_CREATE": struct(value = "CMAKE_C_ARCHIVE_CREATE", replace = False),
"CMAKE_C_FLAGS": struct(value = "CMAKE_C_FLAGS_INIT", replace = False),
"CMAKE_EXE_LINKER_FLAGS": struct(value = "CMAKE_EXE_LINKER_FLAGS_INIT", replace = False),
"CMAKE_RANLIB": struct(value = "CMAKE_RANLIB", replace = True),
"CMAKE_SHARED_LINKER_FLAGS": struct(value = "CMAKE_SHARED_LINKER_FLAGS_INIT", replace = False),
"CMAKE_STATIC_LINKER_FLAGS": struct(value = "CMAKE_STATIC_LINKER_FLAGS_INIT", replace = False),
}
def _create_crosstool_file_text(toolchain_dict, user_cache, user_env):
cache_entries = _dict_copy(user_cache)
env_vars = _dict_copy(user_env)
_move_dict_values(toolchain_dict, env_vars, _CMAKE_ENV_VARS_FOR_CROSSTOOL)
_move_dict_values(toolchain_dict, cache_entries, _CMAKE_CACHE_ENTRIES_CROSSTOOL)
lines = []
for key in toolchain_dict:
if ("CMAKE_AR" == key):
lines.append("set({} \"{}\" {})".format(key, toolchain_dict[key], "CACHE FILEPATH \"Archiver\""))
continue
lines.append("set({} \"{}\")".format(key, toolchain_dict[key]))
cache_entries.update({
"CMAKE_TOOLCHAIN_FILE": "crosstool_bazel.cmake",
})
return struct(
commands = ["cat > crosstool_bazel.cmake <<EOF\n" + "\n".join(sorted(lines)) + "\nEOF\n"],
env = env_vars,
cache = cache_entries,
)
def _dict_copy(d):
out = {}
if d:
out.update(d)
return out
def _create_cache_entries_env_vars(toolchain_dict, user_cache, user_env):
_move_dict_values(toolchain_dict, user_env, _CMAKE_ENV_VARS_FOR_CROSSTOOL)
_move_dict_values(toolchain_dict, user_cache, _CMAKE_CACHE_ENTRIES_CROSSTOOL)
merged_env = _translate_from_toolchain_file(toolchain_dict, _CMAKE_ENV_VARS_FOR_CROSSTOOL)
merged_cache = _translate_from_toolchain_file(toolchain_dict, _CMAKE_CACHE_ENTRIES_CROSSTOOL)
# anything left in the user's env entries does not correspond to anything defined by the toolchain
# => simple merge
merged_env.update(user_env)
merged_cache.update(user_cache)
return struct(
commands = [],
env = merged_env,
cache = merged_cache,
)
def _translate_from_toolchain_file(toolchain_dict, descriptor_map):
reverse = _reverse_descriptor_dict(descriptor_map)
cl_keyed_toolchain = dict()
keys = toolchain_dict.keys()
for key in keys:
env_var_key = reverse.get(key)
if env_var_key:
cl_keyed_toolchain[env_var_key.value] = toolchain_dict.pop(key)
return cl_keyed_toolchain
def _merge_toolchain_and_user_values(toolchain_dict, user_dict, descriptor_map):
_move_dict_values(toolchain_dict, user_dict, descriptor_map)
cl_keyed_toolchain = _translate_from_toolchain_file(toolchain_dict, descriptor_map)
# anything left in the user's env entries does not correspond to anything defined by the toolchain
# => simple merge
cl_keyed_toolchain.update(user_dict)
return cl_keyed_toolchain
def _reverse_descriptor_dict(dict):
out_dict = {}
for key in dict:
value = dict[key]
out_dict[value.value] = struct(value = key, replace = value.replace)
return out_dict
def _move_dict_values(target, source, descriptor_map):
keys = source.keys()
for key in keys:
existing = descriptor_map.get(key)
if existing:
value = source.pop(key)
if existing.replace or target.get(existing.value) == None:
target[existing.value] = value
else:
target[existing.value] = target[existing.value] + " " + value
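The descriptor maps above pair each user-facing key with a toolchain key and a `replace` flag; `_move_dict_values` then either overwrites the toolchain value or appends to it. A runnable Python sketch, using `namedtuple` in place of Starlark's `struct` (the key names below are illustrative):

```python
from collections import namedtuple

Desc = namedtuple("Desc", "value replace")

def move_dict_values(target, source, descriptor_map):
    for key in list(source.keys()):
        desc = descriptor_map.get(key)
        if desc:
            value = source.pop(key)
            if desc.replace or target.get(desc.value) is None:
                target[desc.value] = value  # overwrite toolchain value
            else:
                target[desc.value] = target[desc.value] + " " + value  # append

descriptors = {
    "CC": Desc("CMAKE_C_COMPILER", True),         # replace toolchain value
    "CFLAGS": Desc("CMAKE_C_FLAGS_INIT", False),  # append to toolchain value
}
toolchain = {"CMAKE_C_COMPILER": "gcc", "CMAKE_C_FLAGS_INIT": "-O2"}
user = {"CC": "clang", "CFLAGS": "-Wall"}
move_dict_values(toolchain, user, descriptors)
print(toolchain)  # {'CMAKE_C_COMPILER': 'clang', 'CMAKE_C_FLAGS_INIT': '-O2 -Wall'}
```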
def _fill_crossfile_from_toolchain(workspace_name, tools, flags):
dict = {}
_sysroot = _find_in_cc_or_cxx(flags, "sysroot")
if _sysroot:
dict["CMAKE_SYSROOT"] = _absolutize(workspace_name, _sysroot)
_ext_toolchain_cc = _find_flag_value(flags.cc, "gcc_toolchain")
if _ext_toolchain_cc:
dict["CMAKE_C_COMPILER_EXTERNAL_TOOLCHAIN"] = _absolutize(workspace_name, _ext_toolchain_cc)
_ext_toolchain_cxx = _find_flag_value(flags.cxx, "gcc_toolchain")
if _ext_toolchain_cxx:
dict["CMAKE_CXX_COMPILER_EXTERNAL_TOOLCHAIN"] = _absolutize(workspace_name, _ext_toolchain_cxx)
# Force convert tools paths to absolute using $EXT_BUILD_ROOT
if tools.cc:
dict["CMAKE_C_COMPILER"] = _absolutize(workspace_name, tools.cc, True)
if tools.cxx:
dict["CMAKE_CXX_COMPILER"] = _absolutize(workspace_name, tools.cxx, True)
if tools.cxx_linker_static:
dict["CMAKE_AR"] = _absolutize(workspace_name, tools.cxx_linker_static, True)
if tools.cxx_linker_static.endswith("/libtool"):
dict["CMAKE_C_ARCHIVE_CREATE"] = "<CMAKE_AR> %s <OBJECTS>" % \
" ".join(flags.cxx_linker_static)
dict["CMAKE_CXX_ARCHIVE_CREATE"] = "<CMAKE_AR> %s <OBJECTS>" % \
" ".join(flags.cxx_linker_static)
if tools.cxx_linker_executable and tools.cxx_linker_executable != tools.cxx:
normalized_path = _absolutize(workspace_name, tools.cxx_linker_executable)
dict["CMAKE_CXX_LINK_EXECUTABLE"] = " ".join([
normalized_path,
"<FLAGS>",
"<CMAKE_CXX_LINK_FLAGS>",
"<LINK_FLAGS>",
"<OBJECTS>",
"-o <TARGET>",
"<LINK_LIBRARIES>",
])
if flags.cc:
dict["CMAKE_C_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.cc)
if flags.cxx:
dict["CMAKE_CXX_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.cxx)
if flags.assemble:
dict["CMAKE_ASM_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.assemble)
# TODO: these options are needed, but cause a bug because the flags are put in the wrong order => keep this line
# if flags.cxx_linker_static:
# lines += [_set_list(ctx, "CMAKE_STATIC_LINKER_FLAGS_INIT", flags.cxx_linker_static)]
if flags.cxx_linker_shared:
dict["CMAKE_SHARED_LINKER_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.cxx_linker_shared)
if flags.cxx_linker_executable:
dict["CMAKE_EXE_LINKER_FLAGS_INIT"] = _join_flags_list(workspace_name, flags.cxx_linker_executable)
return dict
def _find_in_cc_or_cxx(flags, flag_name_no_dashes):
_value = _find_flag_value(flags.cxx, flag_name_no_dashes)
if _value:
return _value
return _find_flag_value(flags.cc, flag_name_no_dashes)
def _find_flag_value(list, flag_name_no_dashes):
one_dash = "-" + flag_name_no_dashes.lstrip(" ")
two_dash = "--" + flag_name_no_dashes.lstrip(" ")
check_for_value = False
for value in list:
value = value.lstrip(" ")
if check_for_value:
return value.lstrip(" =")
_tail = _tail_if_starts_with(value, one_dash)
_tail = _tail_if_starts_with(value, two_dash) if _tail == None else _tail
if _tail != None and len(_tail) > 0:
return _tail.lstrip(" =")
if _tail != None:
check_for_value = True
return None
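`_find_flag_value` handles both the inline `--flag=value` form and the two-token `-flag value` form; a minimal Python sketch of the same parsing:

```python
def find_flag_value(flags, name):
    one_dash = "-" + name
    two_dash = "--" + name
    expect_value = False
    for value in flags:
        value = value.lstrip(" ")
        if expect_value:
            # The previous token was the bare flag; this token is its value.
            return value.lstrip(" =")
        tail = None
        if value.startswith(one_dash):
            tail = value[len(one_dash):]
        elif value.startswith(two_dash):
            tail = value[len(two_dash):]
        if tail:
            # Inline form, e.g. "--sysroot=/path".
            return tail.lstrip(" =")
        if tail is not None:
            expect_value = True  # bare flag; value follows as the next token
    return None

print(find_flag_value(["-O2", "--sysroot=/usr/arm", "-g"], "sysroot"))  # /usr/arm
print(find_flag_value(["-target", "armv7-linux"], "target"))  # armv7-linux
```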
def _tail_if_starts_with(str, start):
if (str.startswith(start)):
return str[len(start):]
return None
def _absolutize(workspace_name, text, force = False):
if text.strip(" ").startswith("C:") or text.strip(" ").startswith("c:"):
return text
return absolutize_path_in_str(workspace_name, "$EXT_BUILD_ROOT/", text, force)
def _join_flags_list(workspace_name, flags):
return " ".join([_absolutize(workspace_name, flag) for flag in flags])
export_for_test = struct(
absolutize = _absolutize,
tail_if_starts_with = _tail_if_starts_with,
find_flag_value = _find_flag_value,
fill_crossfile_from_toolchain = _fill_crossfile_from_toolchain,
move_dict_values = _move_dict_values,
reverse_descriptor_dict = _reverse_descriptor_dict,
merge_toolchain_and_user_values = _merge_toolchain_and_user_values,
CMAKE_ENV_VARS_FOR_CROSSTOOL = _CMAKE_ENV_VARS_FOR_CROSSTOOL,
CMAKE_CACHE_ENTRIES_CROSSTOOL = _CMAKE_CACHE_ENTRIES_CROSSTOOL,
# buildifier: disable=bzl-visibility
load(
"//foreign_cc/private:cmake_script.bzl",
_create_cmake_script = "create_cmake_script",
)
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
print_deprecation()
create_cmake_script = _create_cmake_script

View File

@ -1,187 +1,8 @@
# buildifier: disable=module-docstring
load("@rules_foreign_cc//tools/build_defs:shell_script_helper.bzl", "os_name")
load("//toolchains/native_tools:tool_access.bzl", "get_make_data")
load(
"//tools/build_defs:cc_toolchain_util.bzl",
"get_flags_info",
"get_tools_info",
"is_debug_mode",
)
load(
"//tools/build_defs:detect_root.bzl",
"detect_root",
)
load(
"//tools/build_defs:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
load(":configure_script.bzl", "create_configure_script")
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""
def _configure_make(ctx):
make_data = get_make_data(ctx)
load("//foreign_cc:defs.bzl", _configure_make = "configure_make")
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
tools_deps = ctx.attr.tools_deps + make_data.deps
print_deprecation()
copy_results = "##copy_dir_contents_to_dir## $$BUILD_TMPDIR$$/$$INSTALL_PREFIX$$ $$INSTALLDIR$$\n"
attrs = create_attrs(
ctx.attr,
configure_name = "Configure",
create_configure_script = _create_configure_script,
postfix_script = copy_results + "\n" + ctx.attr.postfix_script,
tools_deps = tools_deps,
make_path = make_data.path,
)
return cc_external_rule_impl(ctx, attrs)
def _create_configure_script(configureParameters):
ctx = configureParameters.ctx
inputs = configureParameters.inputs
root = detect_root(ctx.attr.lib_source)
install_prefix = _get_install_prefix(ctx)
tools = get_tools_info(ctx)
flags = get_flags_info(ctx)
define_install_prefix = "export INSTALL_PREFIX=\"" + _get_install_prefix(ctx) + "\"\n"
configure = create_configure_script(
workspace_name = ctx.workspace_name,
# by default, pass the execution OS as the target OS
target_os = os_name(ctx),
tools = tools,
flags = flags,
root = root,
user_options = ctx.attr.configure_options,
user_vars = dict(ctx.attr.configure_env_vars),
is_debug = is_debug_mode(ctx),
configure_command = ctx.attr.configure_command,
deps = ctx.attr.deps,
inputs = inputs,
configure_in_place = ctx.attr.configure_in_place,
autoconf = ctx.attr.autoconf,
autoconf_options = ctx.attr.autoconf_options,
autoconf_env_vars = ctx.attr.autoconf_env_vars,
autoreconf = ctx.attr.autoreconf,
autoreconf_options = ctx.attr.autoreconf_options,
autoreconf_env_vars = ctx.attr.autoreconf_env_vars,
autogen = ctx.attr.autogen,
autogen_command = ctx.attr.autogen_command,
autogen_options = ctx.attr.autogen_options,
autogen_env_vars = ctx.attr.autogen_env_vars,
)
return "\n".join([define_install_prefix, configure])
def _get_install_prefix(ctx):
if ctx.attr.install_prefix:
return ctx.attr.install_prefix
if ctx.attr.lib_name:
return ctx.attr.lib_name
return ctx.attr.name
def _attrs():
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
attrs.update({
"autoconf": attr.bool(
mandatory = False,
default = False,
doc = (
"Set to True if 'autoconf' should be invoked before 'configure', " +
"currently requires 'configure_in_place' to be True."
),
),
"autoconf_env_vars": attr.string_dict(
doc = "Environment variables to be set for 'autoconf' invocation.",
),
"autoconf_options": attr.string_list(
doc = "Any options to be put in the 'autoconf.sh' command line.",
),
"autogen": attr.bool(
doc = (
"Set to True if 'autogen.sh' should be invoked before 'configure', " +
"currently requires 'configure_in_place' to be True."
),
mandatory = False,
default = False,
),
"autogen_command": attr.string(
doc = (
"The name of the autogen script file, default: autogen.sh. " +
"Many projects use autogen.sh however the Autotools FAQ recommends bootstrap " +
"so we provide this option to support that."
),
default = "autogen.sh",
),
"autogen_env_vars": attr.string_dict(
doc = "Environment variables to be set for 'autogen' invocation.",
),
"autogen_options": attr.string_list(
doc = "Any options to be put in the 'autogen.sh' command line.",
),
"autoreconf": attr.bool(
doc = (
"Set to True if 'autoreconf' should be invoked before 'configure.', " +
"currently requires 'configure_in_place' to be True."
),
mandatory = False,
default = False,
),
"autoreconf_env_vars": attr.string_dict(
doc = "Environment variables to be set for 'autoreconf' invocation.",
),
"autoreconf_options": attr.string_list(
doc = "Any options to be put in the 'autoreconf.sh' command line.",
),
"configure_command": attr.string(
doc = (
"The name of the configuration script file, default: configure. " +
"The file must be in the root of the source directory."
),
default = "configure",
),
"configure_env_vars": attr.string_dict(
doc = "Environment variables to be set for the 'configure' invocation.",
),
"configure_in_place": attr.bool(
doc = (
"Set to True if 'configure' should be invoked in place, i.e. from its enclosing " +
"directory."
),
mandatory = False,
default = False,
),
"configure_options": attr.string_list(
doc = "Any options to be put on the 'configure' command line.",
),
"install_prefix": attr.string(
doc = (
"Install prefix, i.e. relative path to where to install the result of the build. " +
"Passed to the 'configure' script with --prefix flag."
),
mandatory = False,
),
})
return attrs
configure_make = rule(
doc = (
"Rule for building external libraries with configure-make pattern. " +
"Some 'configure' script is invoked with --prefix=install (by default), " +
"and other parameters for compilation and linking, taken from Bazel C/C++ " +
"toolchain and passed dependencies. " +
"After configuration, GNU Make is called."
),
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _configure_make,
toolchains = [
"@rules_foreign_cc//toolchains:make_toolchain",
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)
configure_make = _configure_make

View File

@ -1,238 +1,16 @@
# buildifier: disable=module-docstring
load(":cc_toolchain_util.bzl", "absolutize_path_in_str")
load(":framework.bzl", "get_foreign_cc_dep")
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""
def _pkgconfig_script(ext_build_dirs):
"""Create a script fragment to configure pkg-config"""
script = []
if ext_build_dirs:
for ext_dir in ext_build_dirs:
script.append("##increment_pkg_config_path## $$EXT_BUILD_DEPS$$/" + ext_dir.basename)
script.append("echo \"PKG_CONFIG_PATH=$${PKG_CONFIG_PATH:-}$$\"")
# buildifier: disable=bzl-visibility
load(
"//foreign_cc/private:configure_script.bzl",
_create_configure_script = "create_configure_script",
_create_make_script = "create_make_script",
_get_env_vars = "get_env_vars",
)
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
script.append("##define_absolute_paths## $$EXT_BUILD_DEPS$$ $$EXT_BUILD_DEPS$$")
print_deprecation()
return script
# buildifier: disable=function-docstring
def create_configure_script(
workspace_name,
target_os,
tools,
flags,
root,
user_options,
user_vars,
is_debug,
configure_command,
deps,
inputs,
configure_in_place,
autoconf,
autoconf_options,
autoconf_env_vars,
autoreconf,
autoreconf_options,
autoreconf_env_vars,
autogen,
autogen_command,
autogen_options,
autogen_env_vars):
env_vars_string = get_env_vars(workspace_name, tools, flags, user_vars, deps, inputs)
ext_build_dirs = inputs.ext_build_dirs
script = _pkgconfig_script(ext_build_dirs)
root_path = "$$EXT_BUILD_ROOT$$/{}".format(root)
configure_path = "{}/{}".format(root_path, configure_command)
if configure_in_place:
script.append("##symlink_contents_to_dir## $$EXT_BUILD_ROOT$$/{} $$BUILD_TMPDIR$$".format(root))
root_path = "$$BUILD_TMPDIR$$"
configure_path = "{}/{}".format(root_path, configure_command)
if autogen and configure_in_place:
# NOCONFIGURE is a pseudo-standard environment variable that tells the script not to invoke configure.
# We explicitly invoke configure later.
autogen_env_vars = _get_autogen_env_vars(autogen_env_vars)
script.append("{} \"{}/{}\" {}".format(
" ".join(["{}=\"{}\"".format(key, autogen_env_vars[key]) for key in autogen_env_vars]),
root_path,
autogen_command,
" ".join(autogen_options),
).lstrip())
if autoconf and configure_in_place:
script.append("{} autoconf {}".format(
" ".join(["{}=\"{}\"".format(key, autoconf_env_vars[key]) for key in autoconf_env_vars]),
" ".join(autoconf_options),
).lstrip())
if autoreconf and configure_in_place:
script.append("{} autoreconf {}".format(
" ".join(["{}=\"{}\"".format(key, autoreconf_env_vars[key]) for key in autoreconf_env_vars]),
" ".join(autoreconf_options),
).lstrip())
script.append("{env_vars} \"{configure}\" --prefix=$$BUILD_TMPDIR$$/$$INSTALL_PREFIX$$ {user_options}".format(
env_vars = env_vars_string,
configure = configure_path,
user_options = " ".join(user_options),
))
return "\n".join(script)
# buildifier: disable=function-docstring
def create_make_script(
workspace_name,
tools,
flags,
root,
user_vars,
deps,
inputs,
make_commands,
prefix):
env_vars_string = get_env_vars(workspace_name, tools, flags, user_vars, deps, inputs)
ext_build_dirs = inputs.ext_build_dirs
script = _pkgconfig_script(ext_build_dirs)
script.append("##symlink_contents_to_dir## $$EXT_BUILD_ROOT$$/{} $$BUILD_TMPDIR$$".format(root))
script.extend(make_commands)
return "\n".join(script)
# buildifier: disable=function-docstring
def get_env_vars(
workspace_name,
tools,
flags,
user_vars,
deps,
inputs):
vars = _get_configure_variables(tools, flags, user_vars)
deps_flags = _define_deps_flags(deps, inputs)
if "LDFLAGS" in vars.keys():
vars["LDFLAGS"] = vars["LDFLAGS"] + deps_flags.libs
else:
vars["LDFLAGS"] = deps_flags.libs
# -I flags should be put into preprocessor flags, CPPFLAGS
# https://www.gnu.org/software/autoconf/manual/autoconf-2.63/html_node/Preset-Output-Variables.html
vars["CPPFLAGS"] = deps_flags.flags
return " ".join(["{}=\"{}\""
.format(key, _join_flags_list(workspace_name, vars[key])) for key in vars])
def _get_autogen_env_vars(autogen_env_vars):
# Make a copy if necessary so we can set NOCONFIGURE.
if autogen_env_vars.get("NOCONFIGURE"):
return autogen_env_vars
vars = {}
for key in autogen_env_vars:
vars[key] = autogen_env_vars.get(key)
vars["NOCONFIGURE"] = "1"
return vars
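The copy-on-write behavior of `_get_autogen_env_vars` can be sketched in plain Python (a hypothetical stand-in for the Starlark helper, not the rule's actual code):

```python
def get_autogen_env_vars(autogen_env_vars):
    # If the caller already set NOCONFIGURE, reuse the dict as-is.
    if autogen_env_vars.get("NOCONFIGURE"):
        return autogen_env_vars
    # Otherwise copy it and force NOCONFIGURE=1 so autogen.sh does not
    # run configure itself; the rule invokes configure explicitly later.
    vars = dict(autogen_env_vars)
    vars["NOCONFIGURE"] = "1"
    return vars
```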
def _define_deps_flags(deps, inputs):
# It is very important to keep the order for the linker => put them into list
lib_dirs = []
# Here go libraries built with Bazel
gen_dirs_set = {}
for lib in inputs.libs:
dir_ = lib.dirname
if not gen_dirs_set.get(dir_):
gen_dirs_set[dir_] = 1
lib_dirs.append("-L$$EXT_BUILD_ROOT$$/" + dir_)
include_dirs_set = {}
for include_dir in inputs.include_dirs:
include_dirs_set[include_dir] = "-I$$EXT_BUILD_ROOT$$/" + include_dir
for header in inputs.headers:
include_dir = header.dirname
if not include_dirs_set.get(include_dir):
include_dirs_set[include_dir] = "-I$$EXT_BUILD_ROOT$$/" + include_dir
include_dirs = include_dirs_set.values()
# For the external libraries, we need to refer to the places where
# we copied the dependencies ($EXT_BUILD_DEPS/<lib_name>), because
# we also want configure to find those same files with pkg-config,
# *-config scripts, or other mechanisms.
# Since we need the names of include and lib directories under
# the $EXT_BUILD_DEPS/<lib_name>, we ask the provider.
gen_dirs_set = {}
for dep in deps:
external_deps = get_foreign_cc_dep(dep)
if external_deps:
for artifact in external_deps.artifacts.to_list():
if not gen_dirs_set.get(artifact.gen_dir):
gen_dirs_set[artifact.gen_dir] = 1
dir_name = artifact.gen_dir.basename
include_dirs.append("-I$$EXT_BUILD_DEPS$$/{}/{}".format(dir_name, artifact.include_dir_name))
lib_dirs.append("-L$$EXT_BUILD_DEPS$$/{}/{}".format(dir_name, artifact.lib_dir_name))
return struct(
libs = lib_dirs,
flags = include_dirs,
)
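The order-preserving deduplication of `-L` directories above matters because linker search-path order is significant. A minimal Python sketch of that part (function and variable names are hypothetical):

```python
def collect_lib_dirs(lib_paths):
    # Preserve first-seen order: the linker searches -L dirs in order,
    # so we must not reorder while removing duplicates.
    seen = {}
    lib_dirs = []
    for path in lib_paths:
        dir_ = path.rsplit("/", 1)[0]
        if dir_ not in seen:
            seen[dir_] = True
            lib_dirs.append("-L$EXT_BUILD_ROOT/" + dir_)
    return lib_dirs
```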
# See https://www.gnu.org/software/make/manual/html_node/Implicit-Variables.html
_CONFIGURE_FLAGS = {
"ARFLAGS": "cxx_linker_static",
"ASFLAGS": "assemble",
"CFLAGS": "cc",
"CXXFLAGS": "cxx",
"LDFLAGS": "cxx_linker_executable",
# missing: cxx_linker_shared
}
_CONFIGURE_TOOLS = {
"AR": "cxx_linker_static",
"CC": "cc",
"CXX": "cxx",
# missing: cxx_linker_executable
}
def _get_configure_variables(tools, flags, user_env_vars):
vars = {}
for flag in _CONFIGURE_FLAGS:
flag_value = getattr(flags, _CONFIGURE_FLAGS[flag])
if flag_value:
vars[flag] = flag_value
# Merge flags lists
for user_var in user_env_vars:
toolchain_val = vars.get(user_var)
if toolchain_val:
vars[user_var] = toolchain_val + [user_env_vars[user_var]]
tools_dict = {}
for tool in _CONFIGURE_TOOLS:
tool_value = getattr(tools, _CONFIGURE_TOOLS[tool])
if tool_value:
tools_dict[tool] = [tool_value]
# Replace tools paths if user passed other values
for user_var in user_env_vars:
toolchain_val = tools_dict.get(user_var)
if toolchain_val:
tools_dict[user_var] = [user_env_vars[user_var]]
vars.update(tools_dict)
# Put all other environment variables, passed by the user
for user_var in user_env_vars:
if not vars.get(user_var):
vars[user_var] = [user_env_vars[user_var]]
return vars
def _absolutize(workspace_name, text):
return absolutize_path_in_str(workspace_name, "$$EXT_BUILD_ROOT$$/", text)
def _join_flags_list(workspace_name, flags):
return " ".join([_absolutize(workspace_name, flag) for flag in flags])
create_configure_script = _create_configure_script
create_make_script = _create_make_script
get_env_vars = _get_env_vars

View File

@ -0,0 +1,10 @@
"""A helper module to inform users this package is deprecated"""
def print_deprecation():
# buildifier: disable=print
print(
"`@rules_foreign_cc//tools/build_defs/...` is deprecated, please " +
"find the relevant symbols in `@rules_foreign_cc//foreign_cc/...`. " +
"Note that the core rules can now be loaded from " +
"`@rules_foreign_cc//foreign_cc:defs.bzl`",
)
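The deprecation pattern used here (warn once, then alias the relocated symbols under their legacy names) has a close Python analogue. This is a hedged sketch, not the Starlark code; `legacy_impl` and `_relocated_impl` are hypothetical names:

```python
import warnings

def print_deprecation():
    # Analogue of the Starlark helper: warn users that the old package
    # location is deprecated and point them at the new one.
    warnings.warn(
        "`tools/build_defs` is deprecated; load symbols from "
        "`foreign_cc` instead (core rules: `foreign_cc:defs.bzl`).",
        DeprecationWarning,
        stacklevel=2,
    )

def _relocated_impl():
    # Stands in for an implementation that moved to the new package.
    return "moved"

# Legacy name kept working for existing users of the old location.
legacy_impl = _relocated_impl
```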

View File

@ -1,85 +1,14 @@
# buildifier: disable=module-docstring
# buildifier: disable=function-docstring-header
def detect_root(source):
"""Detects the path to the topmost directory of the 'source' outputs.
To be used with external build systems to point to the source code/tools directories.
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""
Args:
source (Target): A filegroup of source files
# buildifier: disable=bzl-visibility
load(
"//foreign_cc/private:detect_root.bzl",
_detect_root = "detect_root",
_filter_containing_dirs_from_inputs = "filter_containing_dirs_from_inputs",
)
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
Returns:
string: The relative path to the root source directory
"""
print_deprecation()
sources = source.files.to_list()
if len(sources) == 0:
return ""
root = None
level = -1
# find topmost directory
for file in sources:
file_level = _get_level(file.path)
# If there is no level set or the current file's level
# is greater than what we have logged, update the root
if level == -1 or level > file_level:
root = file
level = file_level
if not root:
fail("No root source or directory was found")
if root.is_source:
return root.dirname
# Note this code path will never be hit due to a bug upstream Bazel
# https://github.com/bazelbuild/bazel/issues/12954
# If the root is not a source file, it must be a directory.
# Thus the path is returned
return root.path
def _get_level(path):
"""Determine the number of sub directories `path` is contained in
Args:
path (string): The target path
Returns:
int: The directory depth of `path`
"""
normalized = path
# This for loop ensures there are no double `//` substrings.
# A for loop is used because there's not currently a `while`
# or a better mechanism for guaranteeing all `//` have been
# cleaned up.
for i in range(len(path)):
new_normalized = normalized.replace("//", "/")
if len(new_normalized) == len(normalized):
break
normalized = new_normalized
return normalized.count("/")
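The same depth computation works verbatim in Python; the bounded `for` loop stands in for a `while` loop, which Starlark does not have:

```python
def get_level(path):
    # Collapse any runs of "//" before counting separators.  The loop is
    # bounded by len(path) because each pass shortens the string or stops.
    normalized = path
    for _ in range(len(path)):
        new_normalized = normalized.replace("//", "/")
        if len(new_normalized) == len(normalized):
            break
        normalized = new_normalized
    return normalized.count("/")
```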
# buildifier: disable=function-docstring-header
# buildifier: disable=function-docstring-args
# buildifier: disable=function-docstring-return
def filter_containing_dirs_from_inputs(input_files_list):
"""When the directories are also passed in the filegroup with the sources,
we get into a situation where we have containing directories in the sources list,
which is not allowed by Bazel (execroot creation code fails).
The parent directories will be created for us in the execroot anyway,
so we filter them out."""
# This puts directories in front of their children in list
sorted_list = sorted(input_files_list)
contains_map = {}
for input in input_files_list:
# If the immediate parent directory is already in the list, remove it
if contains_map.get(input.dirname):
contains_map.pop(input.dirname)
contains_map[input.path] = input
return contains_map.values()
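The parent-directory filtering can be sketched in Python, with `(path, dirname)` tuples standing in for Bazel `File` objects (a simplification; it assumes, like the original, that parents precede their children in the input list):

```python
def filter_containing_dirs(inputs):
    # `inputs` stands in for Bazel File objects as (path, dirname) pairs.
    # If a file's immediate parent directory is itself in the input list,
    # drop the directory: the execroot gets parent directories for free,
    # and Bazel rejects inputs that contain other inputs.
    contains_map = {}
    for path, dirname in inputs:
        if dirname in contains_map:
            contains_map.pop(dirname)
        contains_map[path] = True
    return list(contains_map)
```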
detect_root = _detect_root
filter_containing_dirs_from_inputs = _filter_containing_dirs_from_inputs

View File

@ -1,894 +1,39 @@
""" Contains definitions for creation of external C/C++ build rules (for building external libraries
with CMake, configure/make, autotools)
"""
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""
load("@bazel_skylib//lib:collections.bzl", "collections")
load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain")
load("@rules_foreign_cc//tools/build_defs:detect_root.bzl", "detect_root", "filter_containing_dirs_from_inputs")
# buildifier: disable=bzl-visibility
load(
"@rules_foreign_cc//tools/build_defs:run_shell_file_utils.bzl",
"copy_directory",
"fictive_file_in_genroot",
)
load(
"@rules_foreign_cc//tools/build_defs:shell_script_helper.bzl",
"convert_shell_script",
"create_function",
"os_name",
)
load(
":cc_toolchain_util.bzl",
"LibrariesToLinkInfo",
"create_linking_info",
"get_env_vars",
"targets_windows",
"//foreign_cc/private:framework.bzl",
_CC_EXTERNAL_RULE_ATTRIBUTES = "CC_EXTERNAL_RULE_ATTRIBUTES",
_ConfigureParameters = "ConfigureParameters",
_InputFiles = "InputFiles",
_WrappedOutputs = "WrappedOutputs",
_cc_external_rule_impl = "cc_external_rule_impl",
_create_attrs = "create_attrs",
_get_foreign_cc_dep = "get_foreign_cc_dep",
_uniq_list_keep_order = "uniq_list_keep_order",
_wrap_outputs = "wrap_outputs",
)
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
# Dict with definitions of the context attributes, that customize cc_external_rule_impl function.
# Many of the attributes have default values.
#
# Typically, the concrete external library rule will use this structure to create the attributes
# description dict. See cmake.bzl as an example.
#
CC_EXTERNAL_RULE_ATTRIBUTES = {
"additional_inputs": attr.label_list(
doc = (
"Optional additional inputs to be declared as needed for the shell script action. " +
"Not used by the shell script part in cc_external_rule_impl."
),
mandatory = False,
allow_files = True,
default = [],
),
"additional_tools": attr.label_list(
doc = (
"Optional additional tools needed for the building. " +
"Not used by the shell script part in cc_external_rule_impl."
),
mandatory = False,
allow_files = True,
default = [],
),
"alwayslink": attr.bool(
doc = (
"Optional. if true, link all the object files from the static library, " +
"even if they are not used."
),
mandatory = False,
default = False,
),
"binaries": attr.string_list(
doc = "__deprecated__: Use `out_binaries` instead.",
),
"data": attr.label_list(
doc = "Files needed by this rule at runtime. May list file or rule targets. Generally allows any target.",
mandatory = False,
allow_files = True,
default = [],
),
"defines": attr.string_list(
doc = (
"Optional compilation definitions to be passed to the dependencies of this library. " +
"They are NOT passed to the compiler, you should duplicate them in the configuration options."
),
mandatory = False,
default = [],
),
"deps": attr.label_list(
doc = (
"Optional dependencies to be copied into the directory structure. " +
"Typically those directly required for the external building of the library/binaries. " +
"(i.e. those that the external build system will be looking for and paths to which are " +
"provided by the calling rule)"
),
mandatory = False,
allow_files = True,
default = [],
),
"env": attr.string_dict(
doc = (
"Environment variables to set during the build. " +
"$(execpath) macros may be used to point at files which are listed as data deps, tools_deps, or additional_tools, " +
"but unlike with other rules, these will be replaced with absolute paths to those files, " +
"because the build does not run in the exec root. " +
"No other macros are supported."
),
),
"headers_only": attr.bool(
doc = "__deprecated__: Use `out_headers_only` instead.",
mandatory = False,
default = False,
),
"interface_libraries": attr.string_list(
doc = "__deprecated__: Use `out_interface_libs` instead.",
mandatory = False,
),
"lib_name": attr.string(
doc = (
"Library name. Defines the name of the install directory and the name of the static library, " +
"if no output file parameters are defined (any of static_libraries, shared_libraries, " +
"interface_libraries, binaries_names) " +
"Optional. If not defined, defaults to the target's name."
),
mandatory = False,
),
"lib_source": attr.label(
doc = (
"Label with source code to build. Typically a filegroup for the source of remote repository. " +
"Mandatory."
),
mandatory = True,
allow_files = True,
),
"linkopts": attr.string_list(
doc = "Optional link options to be passed up to the dependencies of this library",
mandatory = False,
default = [],
),
"make_commands": attr.string_list(
doc = "Optional make commands, defaults to [\"make\", \"make install\"]",
mandatory = False,
default = ["make", "make install"],
),
"out_bin_dir": attr.string(
doc = "Optional name of the output subdirectory with the binary files, defaults to 'bin'.",
mandatory = False,
default = "bin",
),
"out_binaries": attr.string_list(
doc = "Optional names of the resulting binaries.",
mandatory = False,
),
"out_headers_only": attr.bool(
doc = "Flag variable to indicate that the library produces only headers",
mandatory = False,
default = False,
),
"out_include_dir": attr.string(
doc = "Optional name of the output subdirectory with the header files, defaults to 'include'.",
mandatory = False,
default = "include",
),
"out_interface_libs": attr.string_list(
doc = "Optional names of the resulting interface libraries.",
mandatory = False,
),
"out_lib_dir": attr.string(
doc = "Optional name of the output subdirectory with the library files, defaults to 'lib'.",
mandatory = False,
default = "lib",
),
"out_shared_libs": attr.string_list(
doc = "Optional names of the resulting shared libraries.",
mandatory = False,
),
"out_static_libs": attr.string_list(
doc = (
"Optional names of the resulting static libraries. Note that if `out_headers_only`, `out_static_libs`, " +
"`out_shared_libs`, and `out_binaries` are not set, default `lib_name.a`/`lib_name.lib` static " +
"library is assumed"
),
mandatory = False,
),
"postfix_script": attr.string(
doc = "Optional part of the shell script to be added after the make commands",
mandatory = False,
),
"shared_libraries": attr.string_list(
doc = "__deprecated__: Use `out_shared_libs` instead.",
mandatory = False,
),
"static_libraries": attr.string_list(
doc = "__deprecated__: Use `out_static_libs` instead.",
mandatory = False,
),
"tools_deps": attr.label_list(
doc = (
"Optional tools to be copied into the directory structure. " +
"Similar to deps, those directly required for the external building of the library/binaries."
),
mandatory = False,
allow_files = True,
cfg = "host",
default = [],
),
# we need to declare this attribute to access cc_toolchain
"_cc_toolchain": attr.label(
default = Label("@bazel_tools//tools/cpp:current_cc_toolchain"),
),
}
print_deprecation()
# A list of common fragments required by rules using this framework
CC_EXTERNAL_RULE_FRAGMENTS = [
"cpp",
]
CC_EXTERNAL_RULE_ATTRIBUTES = _CC_EXTERNAL_RULE_ATTRIBUTES
# buildifier: disable=function-docstring-header
# buildifier: disable=function-docstring-args
# buildifier: disable=function-docstring-return
def create_attrs(attr_struct, configure_name, create_configure_script, **kwargs):
"""Function for adding/modifying context attributes struct (originally from ctx.attr),
provided by user, to be passed to the cc_external_rule_impl function as a struct.
Copies a struct 'attr_struct' values (with attributes from CC_EXTERNAL_RULE_ATTRIBUTES)
to the resulting struct, adding or replacing attributes passed in 'configure_name',
'configure_script', and '**kwargs' parameters.
"""
attrs = {}
for key in CC_EXTERNAL_RULE_ATTRIBUTES:
if not key.startswith("_") and hasattr(attr_struct, key):
attrs[key] = getattr(attr_struct, key)
attrs["configure_name"] = configure_name
attrs["create_configure_script"] = create_configure_script
for arg in kwargs:
attrs[arg] = kwargs[arg]
return struct(**attrs)
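In Python terms, `create_attrs` copies only the known public attributes and then layers rule-specific overrides on top, with `**kwargs` winning. A minimal sketch (the `KNOWN_ATTRS` list is a hypothetical stand-in for `CC_EXTERNAL_RULE_ATTRIBUTES`):

```python
KNOWN_ATTRS = ["lib_name", "lib_source", "deps"]  # stand-in for the real attribute dict

def create_attrs(attr_dict, configure_name, create_configure_script, **kwargs):
    # Copy only the known, non-private attributes the caller provided...
    attrs = {
        k: attr_dict[k]
        for k in KNOWN_ATTRS
        if not k.startswith("_") and k in attr_dict
    }
    # ...then add or replace the rule-specific fields; kwargs win last.
    attrs["configure_name"] = configure_name
    attrs["create_configure_script"] = create_configure_script
    attrs.update(kwargs)
    return attrs
```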
cc_external_rule_impl = _cc_external_rule_impl
# buildifier: disable=name-conventions
ForeignCcDeps = provider(
doc = """Provider to pass transitive information about external libraries.""",
fields = {"artifacts": "Depset of ForeignCcArtifact"},
)
ConfigureParameters = _ConfigureParameters
create_attrs = _create_attrs
get_foreign_cc_dep = _get_foreign_cc_dep
# buildifier: disable=name-conventions
ForeignCcArtifact = provider(
doc = """Groups information about the external library install directory,
and relative bin, include and lib directories.
InputFiles = _InputFiles
Serves to pass transitive information about externally built artifacts up the dependency chain.
uniq_list_keep_order = _uniq_list_keep_order
Cannot be used as a top-level provider.
Instances of ForeignCcArtifact are encapsulated in a depset ForeignCcDeps#artifacts.""",
fields = {
"bin_dir_name": "Bin directory, relative to install directory",
"gen_dir": "Install directory",
"include_dir_name": "Include directory, relative to install directory",
"lib_dir_name": "Lib directory, relative to install directory",
},
)
wrap_outputs = _wrap_outputs
# buildifier: disable=name-conventions
ConfigureParameters = provider(
doc = """Parameters of create_configure_script callback function, called by
cc_external_rule_impl function. create_configure_script creates the configuration part
of the script, and allows to reuse the inputs structure, created by the framework.""",
fields = dict(
ctx = "Rule context",
attrs = """Attributes struct, created by create_attrs function above""",
inputs = """InputFiles provider: summarized information on rule inputs, created by framework
function, to be reused in script creator. Contains in particular merged compilation and linking
dependencies.""",
),
)
def cc_external_rule_impl(ctx, attrs):
"""Framework function for performing external C/C++ building.
To be used to build external libraries and/or binaries with CMake, configure/make, autotools etc.,
and use results in Bazel.
It is possible to use it to build a group of external libraries that depend on each other or on
a Bazel library, and pass the necessary tools.
Accepts the actual commands for build configuration/execution in attrs.
Creates and runs a shell script, which:
1. prepares directory structure with sources, dependencies, and tools symlinked into subdirectories
of the execroot directory. Adds tools into PATH.
2. defines the correct absolute paths in tools with the script paths, see 7
3. defines the following environment variables:
EXT_BUILD_ROOT: execroot directory
EXT_BUILD_DEPS: subdirectory of execroot, which contains the following subdirectories:
For cmake_external built dependencies:
symlinked install directories of the dependencies
for Bazel built/imported dependencies:
include - here the include directories are symlinked
lib - here the library files are symlinked
lib/pkgconfig - here the pkgconfig files are symlinked
bin - here the tools are copied
INSTALLDIR: subdirectory of the execroot (named by the lib_name), where the library/binary
will be installed
These variables should be used by the calling rule to refer to the created directory structure.
4. calls 'attrs.create_configure_script'
5. calls 'attrs.make_commands'
6. calls 'attrs.postfix_script'
7. replaces absolute paths in possibly created scripts with a placeholder value
Please see cmake.bzl for example usage.
Args:
ctx: calling rule context
attrs: attributes struct, created by create_attrs function above.
Contains fields from CC_EXTERNAL_RULE_ATTRIBUTES (see descriptions there),
two mandatory fields:
- configure_name: name of the configuration tool, to be used in action mnemonic,
- create_configure_script(ConfigureParameters): function that creates configuration
script, accepts ConfigureParameters
and some other fields provided by the rule, which have been passed to create_attrs.
Returns:
A list of providers
"""
lib_name = attrs.lib_name or ctx.attr.name
inputs = _define_inputs(attrs)
outputs = _define_outputs(ctx, attrs, lib_name)
out_cc_info = _define_out_cc_info(ctx, attrs, inputs, outputs)
cc_env = _correct_path_variable(get_env_vars(ctx))
set_cc_envs = ""
execution_os_name = os_name(ctx)
if execution_os_name != "osx":
set_cc_envs = "\n".join(["export {}=\"{}\"".format(key, cc_env[key]) for key in cc_env])
lib_header = "Bazel external C/C++ Rules. Building library '{}'".format(lib_name)
# We cannot declare outputs of the action that are in a parent-child relation,
# so we need to have a (symlinked) copy of the output directory to provide
# both the C/C++ artifacts - libraries, headers, and binaries,
# and the install directory as a whole (which is mostly necessary for chained external builds).
#
# We want the install directory output of this rule to have the same name as the library,
# so symlink it under the same name but in a subdirectory
installdir_copy = copy_directory(ctx.actions, "$$INSTALLDIR$$", "copy_{}/{}".format(lib_name, lib_name))
# we need this fictive file in the root to get the path of the root in the script
empty = fictive_file_in_genroot(ctx.actions, ctx.label.name)
data_dependencies = ctx.attr.data + ctx.attr.tools_deps + ctx.attr.additional_tools
define_variables = [
set_cc_envs,
"export EXT_BUILD_ROOT=##pwd##",
"export INSTALLDIR=$$EXT_BUILD_ROOT$$/" + empty.file.dirname + "/" + lib_name,
"export BUILD_TMPDIR=$${INSTALLDIR}$$.build_tmpdir",
"export EXT_BUILD_DEPS=$${INSTALLDIR}$$.ext_build_deps",
] + [
"export {key}={value}".format(
key = key,
# Prepend the exec root to each $(execpath ) lookup because the working directory will not be the exec root.
value = ctx.expand_location(value.replace("$(execpath ", "$$EXT_BUILD_ROOT$$/$(execpath "), data_dependencies),
)
for key, value in getattr(ctx.attr, "env", {}).items()
]
make_commands = []
for line in attrs.make_commands:
if line == "make" or line.startswith("make "):
make_commands.append(line.replace("make", attrs.make_path, 1))
else:
make_commands.append(line)
script_lines = [
"##echo## \"\"",
"##echo## \"{}\"".format(lib_header),
"##echo## \"\"",
"##script_prelude##",
"\n".join(define_variables),
"##path## $$EXT_BUILD_ROOT$$",
"##mkdirs## $$INSTALLDIR$$",
"##mkdirs## $$BUILD_TMPDIR$$",
"##mkdirs## $$EXT_BUILD_DEPS$$",
_print_env(),
"\n".join(_copy_deps_and_tools(inputs)),
"cd $$BUILD_TMPDIR$$",
attrs.create_configure_script(ConfigureParameters(ctx = ctx, attrs = attrs, inputs = inputs)),
"\n".join(make_commands),
attrs.postfix_script or "",
# replace references to the root directory when building ($BUILD_TMPDIR)
# and the root where the dependencies were installed ($EXT_BUILD_DEPS)
# for the results which are in $INSTALLDIR (with placeholder)
"##replace_absolute_paths## $$INSTALLDIR$$ $$BUILD_TMPDIR$$",
"##replace_absolute_paths## $$INSTALLDIR$$ $$EXT_BUILD_DEPS$$",
installdir_copy.script,
empty.script,
"cd $$EXT_BUILD_ROOT$$",
]
script_text = "\n".join([
"#!/usr/bin/env bash",
convert_shell_script(ctx, script_lines),
])
wrapped_outputs = wrap_outputs(ctx, lib_name, attrs.configure_name, script_text)
rule_outputs = outputs.declared_outputs + [installdir_copy.file]
cc_toolchain = find_cpp_toolchain(ctx)
execution_requirements = {"block-network": ""}
if "requires-network" in ctx.attr.tags:
execution_requirements = {"requires-network": ""}
# We need to create a native batch script on windows to ensure the wrapper
# script can correctly be invoked with bash.
wrapper = wrapped_outputs.wrapper_script_file
extra_tools = []
if "win" in execution_os_name:
batch_wrapper = ctx.actions.declare_file(wrapper.short_path + ".bat")
ctx.actions.write(
output = batch_wrapper,
content = "\n".join([
"@ECHO OFF",
"START /b /wait bash {}".format(wrapper.path),
"",
]),
is_executable = True,
)
extra_tools.append(wrapper)
wrapper = batch_wrapper
ctx.actions.run(
mnemonic = "Cc" + attrs.configure_name.capitalize() + "MakeRule",
inputs = depset(inputs.declared_inputs),
outputs = rule_outputs + [
empty.file,
wrapped_outputs.log_file,
],
tools = depset(
[wrapped_outputs.script_file] + extra_tools + ctx.files.data + ctx.files.tools_deps + ctx.files.additional_tools,
transitive = [cc_toolchain.all_files] + [data[DefaultInfo].default_runfiles.files for data in data_dependencies],
),
# TODO: Default to never using the default shell environment to make builds more hermetic. For now, every platform
# but MacOS will take the default PATH passed by Bazel, not that from cc_toolchain.
use_default_shell_env = execution_os_name != "osx",
executable = wrapper,
execution_requirements = execution_requirements,
# this is ignored if use_default_shell_env = True
env = cc_env,
)
# Gather runfiles transitively as per the documentation in:
# https://docs.bazel.build/versions/master/skylark/rules.html#runfiles
runfiles = ctx.runfiles(files = ctx.files.data)
for target in [ctx.attr.lib_source] + ctx.attr.additional_inputs + ctx.attr.deps + ctx.attr.data:
runfiles = runfiles.merge(target[DefaultInfo].default_runfiles)
externally_built = ForeignCcArtifact(
gen_dir = installdir_copy.file,
bin_dir_name = attrs.out_bin_dir,
lib_dir_name = attrs.out_lib_dir,
include_dir_name = attrs.out_include_dir,
)
output_groups = _declare_output_groups(installdir_copy.file, outputs.out_binary_files)
wrapped_files = [
wrapped_outputs.script_file,
wrapped_outputs.log_file,
wrapped_outputs.wrapper_script_file,
]
output_groups[attrs.configure_name + "_logs"] = wrapped_files
return [
DefaultInfo(
files = depset(direct = rule_outputs),
runfiles = runfiles,
),
OutputGroupInfo(**output_groups),
ForeignCcDeps(artifacts = depset(
[externally_built],
transitive = _get_transitive_artifacts(attrs.deps),
)),
CcInfo(
compilation_context = out_cc_info.compilation_context,
linking_context = out_cc_info.linking_context,
),
]
# buildifier: disable=name-conventions
WrappedOutputs = provider(
doc = "Structure for passing the log and scripts file information, and wrapper script text.",
fields = {
"log_file": "Execution log file",
"script_file": "Main script file",
"wrapper_script": "Wrapper script text to execute",
"wrapper_script_file": "Wrapper script file (output for debugging purposes)",
},
)
# buildifier: disable=function-docstring
def wrap_outputs(ctx, lib_name, configure_name, script_text):
build_log_file = ctx.actions.declare_file("{}_logs/{}.log".format(lib_name, configure_name))
wrapper_script_file = ctx.actions.declare_file("{}_scripts/{}_wrapper_script.sh".format(lib_name, configure_name))
build_script_file = ctx.actions.declare_file("{}_scripts/{}_script.sh".format(lib_name, configure_name))
ctx.actions.write(
output = build_script_file,
content = script_text,
is_executable = True,
)
cleanup_on_success_function = create_function(
ctx,
"cleanup_on_success",
"rm -rf $BUILD_TMPDIR $EXT_BUILD_DEPS",
)
cleanup_on_failure_function = create_function(
ctx,
"cleanup_on_failure",
"\n".join([
"##echo## \"rules_foreign_cc: Build failed!\"",
"##echo## \"rules_foreign_cc: Keeping temp build directory $$BUILD_TMPDIR$$ and dependencies directory $$EXT_BUILD_DEPS$$ for debug.\"",
"##echo## \"rules_foreign_cc: Please note that the directories inside a sandbox are still cleaned unless you specify '--sandbox_debug' Bazel command line flag.\"",
"##echo## \"rules_foreign_cc: Printing build logs:\"",
"##echo## \"_____ BEGIN BUILD LOGS _____\"",
"##cat## $$BUILD_LOG$$",
"##echo## \"_____ END BUILD LOGS _____\"",
"##echo## \"rules_foreign_cc: Build wrapper script location: $$BUILD_WRAPPER_SCRIPT$$\"",
"##echo## \"rules_foreign_cc: Build script location: $$BUILD_SCRIPT$$\"",
"##echo## \"rules_foreign_cc: Build log location: $$BUILD_LOG$$\"",
"##echo## \"\"",
]),
)
trap_function = "##cleanup_function## cleanup_on_success cleanup_on_failure"
build_command_lines = [
"##assert_script_errors##",
cleanup_on_success_function,
cleanup_on_failure_function,
# the call trap is defined inside, in a way how the shell function should be called
# see, for instance, linux_commands.bzl
trap_function,
"export BUILD_WRAPPER_SCRIPT=\"{}\"".format(wrapper_script_file.path),
"export BUILD_SCRIPT=\"{}\"".format(build_script_file.path),
"export BUILD_LOG=\"{}\"".format(build_log_file.path),
# sometimes the log file is not created, we do not want our script to fail because of this
"##touch## $$BUILD_LOG$$",
"##redirect_out_err## $$BUILD_SCRIPT$$ $$BUILD_LOG$$",
]
build_command = "\n".join([
"#!/usr/bin/env bash",
convert_shell_script(ctx, build_command_lines),
"",
])
ctx.actions.write(
output = wrapper_script_file,
content = build_command,
is_executable = True,
)
return WrappedOutputs(
script_file = build_script_file,
log_file = build_log_file,
wrapper_script_file = wrapper_script_file,
wrapper_script = build_command,
)
def _declare_output_groups(installdir, outputs):
dict_ = {}
dict_["gen_dir"] = depset([installdir])
for output in outputs:
dict_[output.basename] = [output]
return dict_
def _get_transitive_artifacts(deps):
artifacts = []
for dep in deps:
foreign_dep = get_foreign_cc_dep(dep)
if foreign_dep:
artifacts.append(foreign_dep.artifacts)
return artifacts
def _print_env():
return "\n".join([
"##echo## \"Environment:______________\"",
"##env##",
"##echo## \"__________________________\"",
])
def _correct_path_variable(env):
    value = env.get("PATH", "")
    if not value:
        return env
    value = value.replace("C:\\", "/c/")
    value = value.replace("\\", "/")
    value = value.replace(";", ":")
    env["PATH"] = "$PATH:" + value
    return env
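`_correct_path_variable` rewrites a Windows-style `PATH` into the MSYS form before appending it to the existing `$PATH`. A minimal Python sketch of the same transformation (note that, as in the original, only the `C:` drive prefix is handled):

```python
def correct_path_variable(env):
    # Rewrite "C:\..." entries to the MSYS "/c/..." form, convert the
    # path and list separators, and append the result to $PATH.
    value = env.get("PATH", "")
    if not value:
        return env
    value = value.replace("C:\\", "/c/").replace("\\", "/").replace(";", ":")
    env["PATH"] = "$PATH:" + value
    return env

env = correct_path_variable({"PATH": "C:\\msys64\\usr\\bin;C:\\tools"})
```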
def _depset(item):
if item == None:
return depset()
return depset([item])
def _list(item):
if item:
return [item]
return []
def _copy_deps_and_tools(files):
lines = []
lines += _symlink_contents_to_dir("lib", files.libs)
lines += _symlink_contents_to_dir("include", files.headers + files.include_dirs)
if files.tools_files:
lines.append("##mkdirs## $$EXT_BUILD_DEPS$$/bin")
for tool in files.tools_files:
lines.append("##symlink_to_dir## $$EXT_BUILD_ROOT$$/{} $$EXT_BUILD_DEPS$$/bin/".format(tool))
for ext_dir in files.ext_build_dirs:
lines.append("##symlink_to_dir## $$EXT_BUILD_ROOT$$/{} $$EXT_BUILD_DEPS$$".format(_file_path(ext_dir)))
lines.append("##children_to_path## $$EXT_BUILD_DEPS$$/bin")
lines.append("##path## $$EXT_BUILD_DEPS$$/bin")
return lines
def _symlink_contents_to_dir(dir_name, files_list):
# It is possible that some duplicate libraries will be passed as inputs
# to cmake_external or configure_make. Filter duplicates out here.
files_list = collections.uniq(files_list)
if len(files_list) == 0:
return []
lines = ["##mkdirs## $$EXT_BUILD_DEPS$$/" + dir_name]
for file in files_list:
path = _file_path(file).strip()
if path:
lines.append("##symlink_contents_to_dir## \
$$EXT_BUILD_ROOT$$/{} $$EXT_BUILD_DEPS$$/{}".format(path, dir_name))
return lines
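`_symlink_contents_to_dir` first deduplicates its inputs (several dependencies may expose the same library) and then emits one `##symlink_contents_to_dir##` command line per unique path. A Python sketch over plain path strings (the real helper works on Bazel `File` objects and uses skylib's `collections.uniq`):

```python
def symlink_contents_to_dir(dir_name, paths):
    # Duplicate inputs are possible when several dependencies expose the
    # same library; keep only the first occurrence of each path.
    uniq = []
    for p in paths:
        if p not in uniq:
            uniq.append(p)
    if not uniq:
        return []
    lines = ["##mkdirs## $$EXT_BUILD_DEPS$$/" + dir_name]
    for path in uniq:
        path = path.strip()
        if path:
            lines.append("##symlink_contents_to_dir## $$EXT_BUILD_ROOT$$/{} $$EXT_BUILD_DEPS$$/{}".format(path, dir_name))
    return lines

lines = symlink_contents_to_dir("lib", ["a/libz.a", "a/libz.a", "b/libssl.a"])
```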
def _file_path(file):
return file if type(file) == "string" else file.path
_FORBIDDEN_FOR_FILENAME = ["\\", "/", ":", "*", "\"", "<", ">", "|"]
def _check_file_name(var):
    if len(var) == 0:
        fail("Library name cannot be an empty string.")

    # Check every character, including the last one
    for index in range(len(var)):
        letter = var[index]
        if letter in _FORBIDDEN_FOR_FILENAME:
            fail("Symbol '%s' is forbidden in library name '%s'." % (letter, var))
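`_check_file_name` rejects names that would be invalid as file names. A Python sketch of the same validation that checks every character, raising `ValueError` where Starlark would `fail`:

```python
_FORBIDDEN_FOR_FILENAME = ["\\", "/", ":", "*", "\"", "<", ">", "|"]

def check_file_name(var):
    # Reject empty names and names containing characters that are not
    # valid in file names (notably on Windows).
    if len(var) == 0:
        raise ValueError("Library name cannot be an empty string.")
    for letter in var:
        if letter in _FORBIDDEN_FOR_FILENAME:
            raise ValueError("Symbol '%s' is forbidden in library name '%s'." % (letter, var))
    return True
```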
# buildifier: disable=name-conventions
_Outputs = provider(
doc = "Provider to keep different kinds of the external build output files and directories",
fields = dict(
out_include_dir = "Directory with header files (relative to install directory)",
out_binary_files = "Binary files, which will be created by the action",
libraries = "Library files, which will be created by the action",
declared_outputs = "All output files and directories of the action",
),
)
def _define_outputs(ctx, attrs, lib_name):
attr_binaries_libs = []
attr_headers_only = attrs.out_headers_only
attr_interface_libs = []
attr_shared_libs = []
attr_static_libs = []
# TODO: Until the deprecated attributes are removed, we must
# create a mutable list so we can ensure they're being included
attr_binaries_libs.extend(getattr(attrs, "out_binaries", []))
attr_interface_libs.extend(getattr(attrs, "out_interface_libs", []))
attr_shared_libs.extend(getattr(attrs, "out_shared_libs", []))
attr_static_libs.extend(getattr(attrs, "out_static_libs", []))
# TODO: These names are deprecated, remove
if getattr(attrs, "binaries", []):
# buildifier: disable=print
print("The `binaries` attr is deprecated in favor of `out_binaries`. Please update the target `{}`".format(ctx.label))
attr_binaries_libs.extend(getattr(attrs, "binaries", []))
if getattr(attrs, "headers_only", False):
# buildifier: disable=print
print("The `headers_only` attr is deprecated in favor of `out_headers_only`. Please update the target `{}`".format(ctx.label))
attr_headers_only = attrs.headers_only
if getattr(attrs, "interface_libraries", []):
# buildifier: disable=print
print("The `interface_libraries` attr is deprecated in favor of `out_interface_libs`. Please update the target `{}`".format(ctx.label))
attr_interface_libs.extend(getattr(attrs, "interface_libraries", []))
if getattr(attrs, "shared_libraries", []):
# buildifier: disable=print
print("The `shared_libraries` attr is deprecated in favor of `out_shared_libs`. Please update the target `{}`".format(ctx.label))
attr_shared_libs.extend(getattr(attrs, "shared_libraries", []))
if getattr(attrs, "static_libraries", []):
# buildifier: disable=print
print("The `static_libraries` attr is deprecated in favor of `out_static_libs`. Please update the target `{}`".format(ctx.label))
attr_static_libs.extend(getattr(attrs, "static_libraries", []))
static_libraries = []
if not attr_headers_only:
if not attr_static_libs and not attr_shared_libs and not attr_binaries_libs and not attr_interface_libs:
static_libraries = [lib_name + (".lib" if targets_windows(ctx, None) else ".a")]
else:
static_libraries = attr_static_libs
_check_file_name(lib_name)
out_include_dir = ctx.actions.declare_directory(lib_name + "/" + attrs.out_include_dir)
out_binary_files = _declare_out(ctx, lib_name, attrs.out_bin_dir, attr_binaries_libs)
libraries = LibrariesToLinkInfo(
static_libraries = _declare_out(ctx, lib_name, attrs.out_lib_dir, static_libraries),
shared_libraries = _declare_out(ctx, lib_name, attrs.out_lib_dir, attr_shared_libs),
interface_libraries = _declare_out(ctx, lib_name, attrs.out_lib_dir, attr_interface_libs),
)
declared_outputs = [out_include_dir] + out_binary_files
declared_outputs += libraries.static_libraries
declared_outputs += libraries.shared_libraries + libraries.interface_libraries
return _Outputs(
out_include_dir = out_include_dir,
out_binary_files = out_binary_files,
libraries = libraries,
declared_outputs = declared_outputs,
)
def _declare_out(ctx, lib_name, dir_, files):
if files and len(files) > 0:
return [ctx.actions.declare_file("/".join([lib_name, dir_, file])) for file in files]
return []
# buildifier: disable=name-conventions
InputFiles = provider(
doc = (
"Provider to keep different kinds of input files, directories, " +
"and C/C++ compilation and linking info from dependencies"
),
fields = dict(
headers = "Include files built by Bazel. Will be copied into $EXT_BUILD_DEPS/include.",
include_dirs = (
"Include directories built by Bazel. Will be copied " +
"into $EXT_BUILD_DEPS/include."
),
libs = "Library files built by Bazel. Will be copied into $EXT_BUILD_DEPS/lib.",
tools_files = (
"Files and directories with tools needed for configuration/building " +
"to be copied into the bin folder, which is added to the PATH"
),
ext_build_dirs = (
    "Directories with libraries built by the framework function. " +
    "These directories should be copied into $EXT_BUILD_DEPS/lib-name as is, with all contents."
),
deps_compilation_info = "Merged CcCompilationInfo from deps attribute",
deps_linking_info = "Merged CcLinkingInfo from deps attribute",
declared_inputs = "All files and directories that must be declared as action inputs",
),
)
def _define_inputs(attrs):
cc_infos = []
bazel_headers = []
bazel_system_includes = []
bazel_libs = []
# Libraries built by the framework function of a dependency: copy the
# resulting directories under $EXT_BUILD_DEPS/lib-name
ext_build_dirs = []
for dep in attrs.deps:
external_deps = get_foreign_cc_dep(dep)
cc_infos.append(dep[CcInfo])
if external_deps:
ext_build_dirs += [artifact.gen_dir for artifact in external_deps.artifacts.to_list()]
else:
headers_info = _get_headers(dep[CcInfo].compilation_context)
bazel_headers += headers_info.headers
bazel_system_includes += headers_info.include_dirs
bazel_libs += _collect_libs(dep[CcInfo].linking_context)
# Keep the order of the transitive foreign dependencies
# (the order is important for the correct linking),
# but filter out repeating directories
ext_build_dirs = uniq_list_keep_order(ext_build_dirs)
tools_roots = []
tools_files = []
input_files = []
for tool in attrs.tools_deps:
tool_root = detect_root(tool)
tools_roots.append(tool_root)
for file_list in tool.files.to_list():
tools_files += _list(file_list)
for tool in attrs.additional_tools:
for file_list in tool.files.to_list():
tools_files += _list(file_list)
for input in attrs.additional_inputs:
for file_list in input.files.to_list():
input_files += _list(file_list)
# These variables are needed for the correct construction of the C/C++ providers;
# they should contain all libraries and include directories.
cc_info_merged = cc_common.merge_cc_infos(cc_infos = cc_infos)
return InputFiles(
headers = bazel_headers,
include_dirs = bazel_system_includes,
libs = bazel_libs,
tools_files = tools_roots,
deps_compilation_info = cc_info_merged.compilation_context,
deps_linking_info = cc_info_merged.linking_context,
ext_build_dirs = ext_build_dirs,
declared_inputs = filter_containing_dirs_from_inputs(attrs.lib_source.files.to_list()) +
bazel_libs +
tools_files +
input_files +
cc_info_merged.compilation_context.headers.to_list() +
ext_build_dirs,
)
# buildifier: disable=function-docstring
def uniq_list_keep_order(list):
result = []
contains_map = {}
for item in list:
if contains_map.get(item):
continue
contains_map[item] = 1
result.append(item)
return result
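`uniq_list_keep_order` is an order-preserving deduplication: the first occurrence wins, which matters because the relative link order of the transitive foreign dependencies must be kept. The same loop in plain Python:

```python
def uniq_list_keep_order(items):
    # First occurrence wins; relative order is preserved, which is what
    # keeps the link order of transitive foreign deps correct.
    seen = {}
    result = []
    for item in items:
        if item in seen:
            continue
        seen[item] = True
        result.append(item)
    return result
```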
def get_foreign_cc_dep(dep):
return dep[ForeignCcDeps] if ForeignCcDeps in dep else None
# TODO: consider optimizing here to avoid iterating over both collections
def _get_headers(compilation_info):
include_dirs = compilation_info.system_includes.to_list() + \
compilation_info.includes.to_list()
# Do not use quote includes; currently they do not contain
# library-specific information
include_dirs = collections.uniq(include_dirs)
headers = []
for header in compilation_info.headers.to_list():
path = header.path
included = False
for dir_ in include_dirs:
if path.startswith(dir_):
included = True
break
if not included:
headers.append(header)
return struct(
headers = headers,
include_dirs = include_dirs,
)
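`_get_headers` reports only the "loose" headers that are not already reachable through a known include directory. A Python sketch over plain path strings in place of the `CcInfo` compilation context, using the same prefix-match heuristic as the original:

```python
def get_headers(headers, system_includes, includes):
    # Collect unique include directories, then keep only the headers
    # that do not already live under one of them (plain prefix match,
    # mirroring the original's path.startswith check).
    include_dirs = []
    for d in system_includes + includes:
        if d not in include_dirs:
            include_dirs.append(d)
    loose_headers = [
        h for h in headers
        if not any(h.startswith(d) for d in include_dirs)
    ]
    return loose_headers, include_dirs

loose, dirs = get_headers(["inc/a.h", "src/b.h"], ["inc"], [])
```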
def _define_out_cc_info(ctx, attrs, inputs, outputs):
compilation_info = cc_common.create_compilation_context(
headers = depset([outputs.out_include_dir]),
system_includes = depset([outputs.out_include_dir.path]),
includes = depset([]),
quote_includes = depset([]),
defines = depset(attrs.defines),
)
linking_info = create_linking_info(ctx, attrs.linkopts, outputs.libraries)
cc_info = CcInfo(
compilation_context = compilation_info,
linking_context = linking_info,
)
inputs_info = CcInfo(
compilation_context = inputs.deps_compilation_info,
linking_context = inputs.deps_linking_info,
)
return cc_common.merge_cc_infos(cc_infos = [cc_info, inputs_info])
def _extract_libraries(library_to_link):
return [
library_to_link.static_library,
library_to_link.pic_static_library,
library_to_link.dynamic_library,
library_to_link.interface_library,
]
def _collect_libs(cc_linking):
libs = []
for li in cc_linking.linker_inputs.to_list():
for library_to_link in li.libraries:
for library in _extract_libraries(library_to_link):
if library:
libs.append(library)
return collections.uniq(libs)
WrappedOutputs = _WrappedOutputs

View File

@ -1,130 +1,8 @@
# buildifier: disable=module-docstring
load("//toolchains/native_tools:tool_access.bzl", "get_make_data")
load(
"//tools/build_defs:cc_toolchain_util.bzl",
"get_flags_info",
"get_tools_info",
)
load(
"//tools/build_defs:detect_root.bzl",
"detect_root",
)
load(
"//tools/build_defs:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
load(":configure_script.bzl", "create_make_script")
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""

load("//foreign_cc:defs.bzl", _make = "make")
load("//tools/build_defs:deprecation.bzl", "print_deprecation")

print_deprecation()

def _make(ctx):
    make_data = get_make_data(ctx)
    tools_deps = ctx.attr.tools_deps + make_data.deps
attrs = create_attrs(
ctx.attr,
configure_name = "GNUMake",
create_configure_script = _create_make_script,
tools_deps = tools_deps,
make_path = make_data.path,
make_commands = [],
)
return cc_external_rule_impl(ctx, attrs)
def _create_make_script(configureParameters):
ctx = configureParameters.ctx
inputs = configureParameters.inputs
root = detect_root(ctx.attr.lib_source)
install_prefix = _get_install_prefix(ctx)
tools = get_tools_info(ctx)
flags = get_flags_info(ctx)
make_commands = ctx.attr.make_commands or [
"{make} {keep_going} -C $$EXT_BUILD_ROOT$$/{root}".format(
make = configureParameters.attrs.make_path,
keep_going = "-k" if ctx.attr.keep_going else "",
root = root,
),
"{make} -C $$EXT_BUILD_ROOT$$/{root} install PREFIX={prefix}".format(
make = configureParameters.attrs.make_path,
root = root,
prefix = install_prefix,
),
]
return create_make_script(
workspace_name = ctx.workspace_name,
tools = tools,
flags = flags,
root = root,
user_vars = dict(ctx.attr.make_env_vars),
deps = ctx.attr.deps,
inputs = inputs,
make_commands = make_commands,
prefix = install_prefix,
)
def _get_install_prefix(ctx):
if ctx.attr.prefix:
return ctx.attr.prefix
return "$$INSTALLDIR$$"
def _attrs():
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
attrs.update({
"keep_going": attr.bool(
doc = (
    "Keep going when some targets cannot be made; the -k flag is passed to make " +
    "(applies only if the make_commands attribute is not set). " +
    "See _create_make_script for the default make_commands."
),
mandatory = False,
default = True,
),
"make_commands": attr.string_list(
doc = (
    "Overrides the make_commands default value with an empty list " +
    "so that a better default can be provided programmatically"
),
mandatory = False,
default = [],
),
"make_env_vars": attr.string_dict(
doc = "Environment variables to be set for the `make` invocation.",
),
"prefix": attr.string(
doc = (
"Install prefix, an absolute path. " +
"Passed to the GNU make via \"make install PREFIX=<value>\". " +
"By default, the install directory created under sandboxed execution root is used. " +
"Build results are copied to Bazel's output directory, so the prefix is only important " +
"if it is recorded into any text files by the Makefile. " +
"In that case, it is important to note that rules_foreign_cc is overriding the paths under " +
"execution root with \"BAZEL_GEN_ROOT\" value."
),
mandatory = False,
),
})
return attrs
make = rule(
doc = (
    "Rule for building external libraries with GNU Make. " +
    "GNU Make commands (make and make install by default) are invoked with the " +
    "configured install prefix and with environment variables for compilation and " +
    "linking, taken from the Bazel C/C++ toolchain and passed dependencies."
),
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _make,
toolchains = [
"@rules_foreign_cc//toolchains:make_toolchain",
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)
make = _make

View File

@ -1,65 +0,0 @@
load("@bazel_skylib//:bzl_library.bzl", "bzl_library")
_DEPRECATION_NOTICE = "This target has moved to `@rules_foreign_cc//toolchains/...`"
alias(
name = "preinstalled_make",
actual = "//toolchains:preinstalled_make",
deprecation = _DEPRECATION_NOTICE,
tags = ["manual"],
visibility = ["//visibility:public"],
)
alias(
name = "make_tool",
actual = "//toolchains:make_tool",
deprecation = _DEPRECATION_NOTICE,
tags = ["manual"],
visibility = ["//visibility:public"],
)
alias(
name = "built_make",
actual = "//toolchains:built_make",
deprecation = _DEPRECATION_NOTICE,
tags = ["manual"],
visibility = ["//visibility:public"],
)
alias(
name = "preinstalled_cmake",
actual = "//toolchains:preinstalled_cmake",
deprecation = _DEPRECATION_NOTICE,
tags = ["manual"],
visibility = ["//visibility:public"],
)
alias(
name = "built_cmake",
actual = "//toolchains:built_cmake",
deprecation = _DEPRECATION_NOTICE,
tags = ["manual"],
visibility = ["//visibility:public"],
)
alias(
name = "preinstalled_ninja",
actual = "//toolchains:preinstalled_ninja",
deprecation = _DEPRECATION_NOTICE,
tags = ["manual"],
visibility = ["//visibility:public"],
)
alias(
name = "built_ninja",
actual = "//toolchains:built_ninja",
deprecation = _DEPRECATION_NOTICE,
tags = ["manual"],
visibility = ["//visibility:public"],
)
bzl_library(
name = "bzl_srcs",
srcs = glob(["**/*.bzl"]),
visibility = ["//visibility:public"],
)

View File

@ -1,9 +0,0 @@
"""This module has been moved to `//toolchains/native_tools:native_tools_toolchains.bzl`.
This file will be removed at some point in the future
"""
load("//toolchains/native_tools:native_tools_toolchain.bzl", _native_tool_toolchain = "native_tool_toolchain")
load("//toolchains/native_tools:tool_access.bzl", _access_tool = "access_tool")
native_tool_toolchain = _native_tool_toolchain
access_tool = _access_tool

View File

@ -1,14 +0,0 @@
"""This module has been moved to `//toolchains/native_tools:native_tools_toolchains.bzl`.
This file will be removed at some point in the future
"""
load(
"//toolchains/native_tools:tool_access.bzl",
_get_cmake_data = "get_cmake_data",
_get_make_data = "get_make_data",
_get_ninja_data = "get_ninja_data",
)
get_cmake_data = _get_cmake_data
get_make_data = _get_make_data
get_ninja_data = _get_ninja_data

View File

@ -1,125 +1,8 @@
"""A module defining the `ninja` rule. A rule for building projects using the Ninja build tool"""
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""
load("//toolchains/native_tools:tool_access.bzl", "get_ninja_data")
load(
"//tools/build_defs:detect_root.bzl",
"detect_root",
)
load(
"//tools/build_defs:framework.bzl",
"CC_EXTERNAL_RULE_ATTRIBUTES",
"CC_EXTERNAL_RULE_FRAGMENTS",
"cc_external_rule_impl",
"create_attrs",
)
load("//foreign_cc:defs.bzl", _ninja = "ninja")
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
print_deprecation()

def _ninja_impl(ctx):
    """The implementation of the `ninja` rule

    Args:
        ctx (ctx): The rule's context object

    Returns:
        list: A list of providers. See `cc_external_rule_impl`
    """
ninja_data = get_ninja_data(ctx)
tools_deps = ctx.attr.tools_deps + ninja_data.deps
attrs = create_attrs(
ctx.attr,
configure_name = "Ninja",
create_configure_script = _create_ninja_script,
tools_deps = tools_deps,
ninja_path = ninja_data.path,
make_commands = [],
)
return cc_external_rule_impl(ctx, attrs)
def _create_ninja_script(configureParameters):
"""Creates the bash commands that invoke Ninja to build the project
Args:
configureParameters (struct): See `ConfigureParameters`
Returns:
str: A string representing a section of a bash script
"""
ctx = configureParameters.ctx
script = []
root = detect_root(ctx.attr.lib_source)
script.append("##symlink_contents_to_dir## $$EXT_BUILD_ROOT$$/{} $$BUILD_TMPDIR$$".format(root))
data = ctx.attr.data or list()
# Generate a list of arguments for ninja
args = " ".join([
ctx.expand_location(arg, data)
for arg in ctx.attr.args
])
# Set the directory location for the build commands
directory = "$$EXT_BUILD_ROOT$$/{}".format(root)
if ctx.attr.directory:
directory = ctx.expand_location(ctx.attr.directory, data)
# Generate commands for all the targets, ensuring there's
# always at least 1 call to the default target.
for target in ctx.attr.targets or [""]:
# Note that even though directory is always passed, the
# following arguments can take precedence.
script.append("{ninja} -C {dir} {args} {target}".format(
ninja = configureParameters.attrs.ninja_path,
dir = directory,
args = args,
target = target,
))
return "\n".join(script)
def _attrs():
"""Modifies the common set of attributes used by rules_foreign_cc and sets Ninja specific attrs
Returns:
dict: Attributes of the `ninja` rule
"""
attrs = dict(CC_EXTERNAL_RULE_ATTRIBUTES)
# Drop old vars
attrs.pop("make_commands")
attrs.update({
"args": attr.string_list(
doc = "A list of arguments to pass to the call to `ninja`",
),
"directory": attr.string(
doc = (
    "A directory to pass as the `-C` argument. The rule will always use the root " +
    "directory of the `lib_source` attribute if this attribute is not set"
),
),
"targets": attr.string_list(
doc = (
"A list of ninja targets to build. To call the default target, simply pass `\"\"` as " +
"one of the items to this attribute."
),
),
})
return attrs
ninja = rule(
doc = (
"Rule for building external libraries with [Ninja](https://ninja-build.org/)."
),
attrs = _attrs(),
fragments = CC_EXTERNAL_RULE_FRAGMENTS,
output_to_genfiles = True,
implementation = _ninja_impl,
toolchains = [
"@rules_foreign_cc//toolchains:ninja_toolchain",
"@rules_foreign_cc//tools/build_defs/shell_toolchain/toolchains:shell_commands",
"@bazel_tools//tools/cpp:toolchain_type",
],
)
ninja = _ninja

View File

@ -1,48 +1,17 @@
# buildifier: disable=module-docstring
# buildifier: disable=name-conventions
CreatedByScript = provider(
    doc = "Structure to keep declared file or directory and creating script.",
    fields = dict(
        file = "Declared file or directory",
        script = "Script that creates that file or directory",
    ),
)

"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""

# buildifier: disable=bzl-visibility
load(
    "//foreign_cc/private:run_shell_file_utils.bzl",
    _CreatedByScript = "CreatedByScript",
    _copy_directory = "copy_directory",
    _fictive_file_in_genroot = "fictive_file_in_genroot",
)
load("//tools/build_defs:deprecation.bzl", "print_deprecation")
print_deprecation()

def fictive_file_in_genroot(actions, target_name):
    """Creates a fictive file under the build root.

    This makes it possible to address the build root in the script and to construct paths under it.

    Args:
        actions (ctx.actions): actions factory
        target_name (ctx.label.name): name of the current target
    """
# we need this fictive file in the genroot to get the path of the root in the script
empty = actions.declare_file("empty_{}.txt".format(target_name))
return CreatedByScript(
file = empty,
script = "##touch## $$EXT_BUILD_ROOT$$/" + empty.path,
)
def copy_directory(actions, orig_path, copy_path):
    """Copies the directory at $EXT_BUILD_ROOT/orig_path to $EXT_BUILD_ROOT/copy_path.

    I.e. a copy of the directory is created under $EXT_BUILD_ROOT/copy_path.
Args:
actions: actions factory (ctx.actions)
orig_path: path to the original directory, relative to the build root
copy_path: target directory, relative to the build root
"""
dir_copy = actions.declare_directory(copy_path)
return CreatedByScript(
file = dir_copy,
script = "\n".join([
"##mkdirs## $$EXT_BUILD_ROOT$$/" + dir_copy.path,
"##copy_dir_contents_to_dir## {} $$EXT_BUILD_ROOT$$/{}".format(
orig_path,
dir_copy.path,
),
]),
)
# buildifier: disable=name-conventions
CreatedByScript = _CreatedByScript
fictive_file_in_genroot = _fictive_file_in_genroot
copy_directory = _copy_directory

View File

@ -1,197 +1,30 @@
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""

# buildifier: disable=bzl-visibility
load(
    "//foreign_cc/private:shell_script_helper.bzl",
    _convert_shell_script = "convert_shell_script",
    _convert_shell_script_by_context = "convert_shell_script_by_context",
    _create_function = "create_function",
    _do_function_call = "do_function_call",
    _extract_wrapped = "extract_wrapped",
    _get_function_name = "get_function_name",
    _os_name = "os_name",
    _replace_exports = "replace_exports",
    _replace_var_ref = "replace_var_ref",
    _split_arguments = "split_arguments",
)
load("//tools/build_defs:deprecation.bzl", "print_deprecation")

print_deprecation()

"""Contains functions for conversion from the intermediate multiplatform notation for defining
the shell script into the actual shell script for the concrete platform.

Notation:

1. `export <varname>=<value>`
Define the environment variable with the name <varname> and value <value>.
If the <value> contains a toolchain command call (see 3), the call is replaced with the needed value.

2. `$$<varname>$$`
Refer to the environment variable with the name <varname>,
i.e. this will become $<varname> on Linux/MacOS, and %<varname>% on Windows.

3. `##<funname>## <arg1> ... <argn>`
Find the shell toolchain command Starlark method with the name <funname> for that command
in a toolchain, and call it, passing <arg1> .. <argn>.
(see ./shell_toolchain/commands.bzl, ./shell_toolchain/impl/linux_commands.bzl etc.)
The arguments are space-separated; if an argument is quoted, the spaces inside the quotes are
ignored.
! Escaping of the quotes inside a quoted argument is not supported, as it was not needed so far
(quoted arguments are used for paths and never for any arbitrary string).

The call of a shell toolchain Starlark method is performed through
//tools/build_defs/shell_toolchain/toolchains:access.bzl; please refer there for the details.
What is important here is that the Starlark method can also add some text (function definitions)
into a "prelude" part of the shell_context.
The resulting script is constructed from the prelude part with function definitions and
the actual translated script part.
Since function definitions can call other functions, we perform a fictive translation
of the function bodies to populate the "prelude" part of the script.
"""

load("//tools/build_defs/shell_toolchain/toolchains:access.bzl", "call_shell", "create_context")
load("//tools/build_defs/shell_toolchain/toolchains:commands.bzl", "PLATFORM_COMMANDS")
def os_name(ctx):
return call_shell(create_context(ctx), "os_name")
def create_function(ctx, name, text):
return call_shell(create_context(ctx), "define_function", name, text)
def convert_shell_script(ctx, script):
""" Converts a shell script from the intermediate notation to the actual shell script.
Please see the file header for the notation description.
Args:
ctx: rule context
script: the array of script strings, each string can be of multiple lines
Returns:
the string with the shell script for the current execution platform
"""
return convert_shell_script_by_context(create_context(ctx), script)
# buildifier: disable=function-docstring
def convert_shell_script_by_context(shell_context, script):
# 0. Split in lines merged fragments.
new_script = []
for fragment in script:
new_script += fragment.splitlines()
script = new_script
# 1. Call the functions or replace export statements.
script = [do_function_call(line, shell_context) for line in script]
# 2. Make sure functions calls are replaced.
# (it is known there is no deep recursion, do it only once)
script = [do_function_call(line, shell_context) for line in script]
# 3. Same for function bodies.
#
# Since we have some function bodies containing calls to other functions,
# we need to replace calls to the new functions and add the text
# of those functions to shell_context.prelude several times,
# and 4 times is enough for our toolchain.
# Example of such function: 'symlink_contents_to_dir'.
processed_prelude = {}
for i in range(1, 4):
for key in shell_context.prelude.keys():
text = shell_context.prelude[key]
lines = text.splitlines()
replaced = "\n".join([
do_function_call(line.strip(" "), shell_context)
for line in lines
])
processed_prelude[key] = replaced
for key in processed_prelude.keys():
shell_context.prelude[key] = processed_prelude[key]
script = shell_context.prelude.values() + script
# 4. replace all variable references
script = [replace_var_ref(line, shell_context) for line in script]
result = "\n".join(script)
return result
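`convert_shell_script_by_context` resolves `##funname##` tokens by repeated substitution passes, since an expansion may itself contain further tokens. A toy Python sketch of that bounded fixed-point idea (the `TABLE` expansion map is hypothetical; the real implementation dispatches each token to a shell toolchain method via access.bzl):

```python
# Hypothetical expansion table standing in for the shell toolchain:
# each ##name## token expands to text that may contain further tokens.
TABLE = {
    "symlink_contents_to_dir": "##mkdirs## then link",
    "mkdirs": "mkdir -p",
}

def expand_once(line):
    for name, body in TABLE.items():
        line = line.replace("##%s##" % name, body)
    return line

def convert(script, passes=4):
    # Split merged fragments into lines, then run a bounded number of
    # substitution passes so nested tokens get resolved too.
    lines = []
    for fragment in script:
        lines += fragment.splitlines()
    for _ in range(passes):
        lines = [expand_once(line) for line in lines]
    return "\n".join(lines)

result = convert(["##symlink_contents_to_dir## a b"])
```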
# buildifier: disable=function-docstring
def replace_var_ref(text, shell_context):
parts = []
current = text
# long enough
for i in range(1, 100):
(before, varname, after) = extract_wrapped(current, "$$")
if not varname:
parts.append(current)
break
parts.append(before)
parts.append(shell_context.shell.use_var(varname))
current = after
return "".join(parts)
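`replace_var_ref` walks the line, swapping each `$$name$$` reference for the platform-specific form. A Python sketch with a hypothetical `use_var` that renders the Linux syntax (the real code asks the shell toolchain, which yields `%name%` on Windows):

```python
def extract_wrapped(text, marker):
    # (before, name, after) around the first $$name$$ pair, else
    # (text, None, None) when no complete pair is present.
    before, sep, after = text.partition(marker)
    if not sep or not after:
        return (text, None, None)
    name, sep2, rest = after.partition(marker)
    if not sep2:
        raise ValueError("Unbalanced marker in: %s" % text)
    return (before, name, rest)

def use_var(name):
    return "$" + name  # hypothetical Linux rendering

def replace_var_ref(text):
    parts = []
    current = text
    while True:
        before, varname, after = extract_wrapped(current, "$$")
        if varname is None:
            parts.append(current)
            break
        parts.append(before)
        parts.append(use_var(varname))
        current = after
    return "".join(parts)

line = replace_var_ref("cd $$EXT_BUILD_ROOT$$/src")
```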
# buildifier: disable=function-docstring
def replace_exports(text, shell_context):
text = text.strip(" ")
(varname, separator, value) = text.partition("=")
if not separator:
fail("Wrong export declaration")
(funname, after) = get_function_name(value.strip(" "))
if funname:
value = call_shell(shell_context, funname, *split_arguments(after.strip(" ")))
return call_shell(shell_context, "export_var", varname, value)
# buildifier: disable=function-docstring
def get_function_name(text):
(funname, separator, after) = text.partition(" ")
if funname == "export":
return (funname, after)
(before, funname_extracted, after_extracted) = extract_wrapped(funname, "##", "##")
if funname_extracted and PLATFORM_COMMANDS.get(funname_extracted):
if len(before) > 0 or len(after_extracted) > 0:
fail("Something wrong with the shell command call notation: " + text)
return (funname_extracted, after)
return (None, None)
# buildifier: disable=function-docstring
def extract_wrapped(text, prefix, postfix = None):
postfix = postfix or prefix
(before, separator, after) = text.partition(prefix)
if not separator or not after:
return (text, None, None)
(varname, separator2, after2) = after.partition(postfix)
if not separator2:
fail("Variable or function name is not marked correctly in fragment: {}".format(text))
return (before, varname, after2)
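`extract_wrapped` splits a fragment into the text before the marker, the wrapped name, and the text after it. A direct Python port, raising `ValueError` where Starlark would `fail`:

```python
def extract_wrapped(text, prefix, postfix=None):
    # Split `text` around the first prefix...postfix pair into
    # (before, wrapped-name, after); (text, None, None) if no pair found.
    postfix = postfix or prefix
    before, sep, after = text.partition(prefix)
    if not sep or not after:
        return (text, None, None)
    name, sep2, after2 = after.partition(postfix)
    if not sep2:
        raise ValueError("Variable or function name is not marked correctly in fragment: %s" % text)
    return (before, name, after2)
```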
# buildifier: disable=function-docstring
def do_function_call(text, shell_context):
(funname, after) = get_function_name(text.strip(" "))
if not funname:
return text
if funname == "export":
return replace_exports(after, shell_context)
arguments = split_arguments(after.strip(" ")) if after else []
return call_shell(shell_context, funname, *arguments)
# buildifier: disable=function-docstring
def split_arguments(text):
parts = []
current = text.strip(" ")
# long enough
for i in range(1, 100):
if not current:
break
# we are ignoring escaped quotes
(before, separator, after) = current.partition("\"")
if not separator:
parts += current.split(" ")
break
(quoted, separator2, after2) = after.partition("\"")
if not separator2:
fail("Incorrect quoting in fragment: {}".format(current))
before = before.strip(" ")
if before:
parts += before.split(" ")
parts.append("\"" + quoted + "\"")
current = after2
return parts
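`split_arguments` splits on spaces but keeps double-quoted fragments (used for paths) intact; escaped quotes are deliberately unsupported. A Python port of the same loop:

```python
def split_arguments(text):
    # Space-split, but keep double-quoted fragments intact; escaped
    # quotes are not supported (mirroring the original helper).
    parts = []
    current = text.strip(" ")
    while current:
        before, sep, after = current.partition("\"")
        if not sep:
            parts += current.split(" ")
            break
        quoted, sep2, rest = after.partition("\"")
        if not sep2:
            raise ValueError("Incorrect quoting in fragment: %s" % current)
        before = before.strip(" ")
        if before:
            parts += before.split(" ")
        parts.append("\"" + quoted + "\"")
        current = rest.strip(" ")
    return parts
```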
os_name = _os_name
create_function = _create_function
convert_shell_script = _convert_shell_script
convert_shell_script_by_context = _convert_shell_script_by_context
replace_var_ref = _replace_var_ref
replace_exports = _replace_exports
get_function_name = _get_function_name
extract_wrapped = _extract_wrapped
do_function_call = _do_function_call
split_arguments = _split_arguments

View File

@ -1,88 +1,12 @@
"""DEPRECATED: Please use the sources in `@rules_foreign_cc//foreign_cc/...`"""

# buildifier: disable=bzl-visibility
load(
    "//foreign_cc/private/shell_toolchain/polymorphism:generate_overloads.bzl",
    _generate_overloads = "generate_overloads",
)
load("//tools/build_defs:deprecation.bzl", "print_deprecation")

print_deprecation()

# buildifier: disable=module-docstring
def _provider_text(symbols):
    return """
WRAPPER = provider(
    doc = "Wrapper to hold imported methods",
    fields = [{}]
)
""".format(", ".join(["\"%s\"" % symbol_ for symbol_ in symbols]))

def _getter_text():
    return """
def id_from_file(file_name):
    (before, middle, after) = file_name.partition(".")
    return before

def get(file_name):
    id = id_from_file(file_name)
    return WRAPPER(**_MAPPING[id])
"""
def _mapping_text(ids):
data_ = []
for id in ids:
data_.append("{id} = wrapper_{id}".format(id = id))
return "_MAPPING = dict(\n{data}\n)".format(data = ",\n".join(data_))
def _load_and_wrapper_text(id, file_path, symbols):
load_list = ", ".join(["{id}_{symbol} = \"{symbol}\"".format(id = id, symbol = symbol_) for symbol_ in symbols])
load_statement = "load(\":{file}\", {list})".format(file = file_path, list = load_list)
data = ", ".join(["{symbol} = {id}_{symbol}".format(id = id, symbol = symbol_) for symbol_ in symbols])
wrapper_statement = "wrapper_{id} = dict({data})".format(id = id, data = data)
return struct(
load_ = load_statement,
wrapper = wrapper_statement,
)
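`_load_and_wrapper_text` generates, for each toolchain implementation file, a `load()` statement that aliases every symbol with a file-specific prefix, plus a wrapper dict mapping the plain names back. A Python sketch of the string construction:

```python
def load_and_wrapper_text(id_, file_path, symbols):
    # Alias every symbol with the file id as prefix in the load() line,
    # then map the plain names back in a wrapper dict.
    load_list = ", ".join(["{id}_{s} = \"{s}\"".format(id=id_, s=s) for s in symbols])
    load_statement = "load(\":{f}\", {l})".format(f=file_path, l=load_list)
    data = ", ".join(["{s} = {id}_{s}".format(id=id_, s=s) for s in symbols])
    wrapper_statement = "wrapper_{id} = dict({d})".format(id=id_, d=data)
    return load_statement, wrapper_statement

load_stmt, wrapper = load_and_wrapper_text("linux", "linux_commands.bzl", ["os_name", "pwd"])
```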
def id_from_file(file_name):
(before, middle, after) = file_name.partition(".")
return before
def get_file_name(file_label):
(before, separator, after) = file_label.partition(":")
return id_from_file(after)
def _copy_file(rctx, src):
src_path = rctx.path(src)
copy_path = src_path.basename
rctx.template(copy_path, src_path)
return copy_path
_BUILD_FILE = """\
exports_files(
[
"toolchain_data_defs.bzl",
],
visibility = ["//visibility:public"],
)
"""
def _generate_overloads(rctx):
symbols = rctx.attr.symbols
ids = []
lines = ["# Generated overload mappings"]
loads = []
wrappers = []
for file_ in rctx.attr.files:
id = id_from_file(file_.name)
ids.append(id)
copy = _copy_file(rctx, file_)
load_and_wrapper = _load_and_wrapper_text(id, copy, symbols)
loads.append(load_and_wrapper.load_)
wrappers.append(load_and_wrapper.wrapper)
lines += loads
lines += wrappers
lines.append(_mapping_text(ids))
lines.append(_provider_text(symbols))
lines.append(_getter_text())
rctx.file("toolchain_data_defs.bzl", "\n".join(lines))
rctx.file("BUILD", _BUILD_FILE)
generate_overloads = repository_rule(
implementation = _generate_overloads,
attrs = {
"files": attr.label_list(),
"symbols": attr.string_list(),
},
)
generate_overloads = _generate_overloads

View File

@ -1,20 +1,17 @@
load("@bazel_skylib//:bzl_library.bzl", "bzl_library")
load(":defs.bzl", "build_part")

toolchain_type(
    name = "shell_commands",
    visibility = ["//visibility:public"],
)

alias(
    name = "shell_commands",
    actual = "//foreign_cc/private/shell_toolchain/toolchains:shell_commands",
    deprecation = "This target has been moved to `@rules_foreign_cc//foreign_cc/private/shell_toolchain/toolchains:shell_commands`",
    visibility = ["//visibility:public"],
)

build_part(
    toolchain_type = ":shell_commands",
)
bzl_library(
name = "bzl_srcs",
srcs = glob(["**/*.bzl"]) + [
"@commands_overloads//:toolchain_data_defs.bzl",
"@rules_foreign_cc_commands_overloads//:toolchain_data_defs.bzl",
],
visibility = ["//:__subpackages__"],
deps = ["//tools/build_defs/shell_toolchain/toolchains/impl:bzl_srcs"],
deps = ["//foreign_cc/private/shell_toolchain/toolchains/impl:bzl_srcs"],
)

Some files were not shown because too many files have changed in this diff.