Attribute based matching in dependency resolution
This chapter is primarily aimed towards plugin authors who want to understand better how to leverage the capabilities of the dependency resolution engine to support variant-aware dependency management. Users who simply want to understand what configuration attributes are will also find support here.
Different kinds of configurations
Historically, configurations have been at the root of dependency resolution in Gradle. Ultimately, the distinction we want to make is between a consumer and a producer. For this purpose, configurations are used for at least 3 different aspects:
- to declare dependencies
- as a consumer, to resolve a set of dependencies to files
- as a producer, to expose artifacts for consumption by other projects
For example, if we want to express that our application app depends on library lib, we need at least one configuration:
configurations {
// declare a "configuration" named "someConfiguration"
someConfiguration
}
dependencies {
// add a project dependency to the "someConfiguration" configuration
someConfiguration project(":lib")
}
// declare a "configuration" named "someConfiguration"
val someConfiguration by configurations.creating
dependencies {
// add a project dependency to the "someConfiguration" configuration
someConfiguration(project(":lib"))
}
Configurations can extend other configurations, in order to inherit their dependencies.
However, the code above says nothing about the consumer.
In particular, it doesn’t say how the configuration is meant to be used.
Let’s say that lib is a Java library: it can expose different things, such as its API, implementation or test fixtures.
If we want to resolve the dependencies of app, we need to know what kind of task we’re performing (compiling against the API of lib, executing the application, compiling tests, …).
For this purpose, you’ll often find companion configurations, which are meant to unambiguously declare the usage:
configurations {
// declare a configuration that is going to resolve the compile classpath of the application
compileClasspath.extendsFrom(someConfiguration)
// declare a configuration that is going to resolve the runtime classpath of the application
runtimeClasspath.extendsFrom(someConfiguration)
}
At this stage, we have 3 different configurations, which already have different goals:
- someConfiguration declares the dependencies of my application. It’s just a bucket where we declare a list of dependencies.
- compileClasspath and runtimeClasspath are configurations meant to be resolved: when resolved they should contain respectively the compile classpath, and the runtime classpath of the application.
This is actually represented on the Configuration type by the canBeResolved flag.
A configuration that can be resolved is a configuration for which we can compute a dependency graph, because it contains all the necessary information for resolution to happen.
That is to say we’re going to compute a dependency graph, resolve the components in the graph, and eventually get artifacts.
A configuration which has canBeResolved set to false is not meant to be resolved.
Such a configuration is there only to declare dependencies.
The reason is that, depending on the usage (compile classpath, runtime classpath), it can resolve to different graphs.
It is an error to try to resolve a configuration which has canBeResolved set to false.
To some extent, this is similar to an abstract class (canBeResolved=false) which is not supposed to be instantiated, and a concrete class extending the abstract class (canBeResolved=true).
A resolvable configuration will extend at least one non-resolvable configuration (and may extend more than one).
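To make this concrete, here is a minimal Kotlin DSL sketch of a dependency bucket and a resolvable configuration extending it. The configuration names are illustrative and assume no plugin has already created configurations with those names:

```kotlin
// A "bucket": only for declaring dependencies, neither resolvable nor consumable
val someConfiguration by configurations.creating {
    isCanBeResolved = false
    isCanBeConsumed = false
}

// A resolvable configuration inheriting the dependencies of the bucket
val resolvableClasspath by configurations.creating {
    extendsFrom(someConfiguration)
    isCanBeResolved = true
    isCanBeConsumed = false
}

// resolvableClasspath can be resolved to a dependency graph;
// attempting to resolve someConfiguration would fail, since the bucket is not resolvable.
```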
At the other end, on the library project side (the producer), we also use configurations to represent what can be consumed.
For example, the library may expose an API or a runtime, and we would attach artifacts to either one, the other, or both.
Typically, to compile against lib, we need the API of lib, but we don’t need its runtime dependencies.
So the lib project will expose an apiElements configuration, which is aimed at consumers looking for its API.
Such a configuration is consumable, but is not meant to be resolved.
This is expressed via the canBeConsumed flag of a Configuration:
configurations {
// A configuration meant for consumers that need the API of this component
exposedApi {
// This configuration is an "outgoing" configuration, it's not meant to be resolved
canBeResolved = false
// As an outgoing configuration, explain that consumers may want to consume it
canBeConsumed = true
}
// A configuration meant for consumers that need the implementation of this component
exposedRuntime {
canBeResolved = false
canBeConsumed = true
}
}
configurations {
// A configuration meant for consumers that need the API of this component
create("exposedApi") {
// This configuration is an "outgoing" configuration, it's not meant to be resolved
isCanBeResolved = false
// As an outgoing configuration, explain that consumers may want to consume it
isCanBeConsumed = true
}
// A configuration meant for consumers that need the implementation of this component
create("exposedRuntime") {
isCanBeResolved = false
isCanBeConsumed = true
}
}
In short, a configuration’s role is determined by the canBeResolved and canBeConsumed flag combinations:
Configuration role | can be resolved | can be consumed
---|---|---
Bucket of dependencies | false | false
Resolve for certain usage | true | false
Exposed to consumers | false | true
Legacy, don’t use | true | true
For backwards compatibility, both flags have true as their default value, but as a plugin author, you should always determine the right values for those flags, or you might accidentally introduce resolution errors.
Configuration attributes
We have explained that there are 3 configuration roles, and that we may want to resolve the compile and runtime classpath differently, but nothing in what we’ve written so far expresses that difference.
This is where attributes come into play.
The role of attributes is to perform the selection of the right variant of a component.
In our example, the lib library exposes 2 variants: its API (via exposedApi) and its runtime (via exposedRuntime).
There’s no restriction on the number of variants a component can expose.
We may, for example, want to expose the test fixtures of a component too.
But then, the consumer needs a way to explain which variant it requires, and this is done by setting attributes on both the consumer and producer ends.
An attribute consists of a name and a value.
Gradle comes with standard attributes named org.gradle.usage, org.gradle.category and org.gradle.libraryelements specifically to deal with the concept of selecting the right variant of a component based on the usage of the consumer (compile, runtime …).
It is however possible to define an arbitrary number of attributes.
As a producer, I can express that a consumable configuration represents the API of a component by attaching the org.gradle.usage=java-api attribute to the configuration.
As a consumer, I can express that I need the API of the dependencies of a resolvable configuration by attaching the org.gradle.usage=java-api attribute to it.
Now Gradle has a way to automatically select the appropriate variant by looking at the configuration attributes:
- the consumer wants org.gradle.usage=java-api
- the dependent project exposes 2 different variants: one with org.gradle.usage=java-api, the other with org.gradle.usage=java-runtime
- Gradle selects the org.gradle.usage=java-api variant
In other words: selection is performed based on the values of the attributes. It doesn’t matter what the names of the configurations are: only the attributes matter.
Declaring attributes
Attributes are typed. An attribute can be created via the Attribute.of() factory method:
// An attribute of type `String`
def myAttribute = Attribute.of("my.attribute.name", String)
// An attribute of type `Usage`
def myUsage = Attribute.of("my.usage.attribute", Usage)
// An attribute of type `String`
val myAttribute = Attribute.of("my.attribute.name", String::class.java)
// An attribute of type `Usage`
val myUsage = Attribute.of("my.usage.attribute", Usage::class.java)
Currently, only attribute types of String, or anything extending Named, are supported.
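For example, a richer attribute type can be modeled by extending Named; the Flavor interface and attribute name below are purely illustrative:

```kotlin
// A custom attribute type: values of a Named type must be created
// via the object factory (project.objects.named(...)), as shown below
interface Flavor : Named

// A typed attribute using that interface
val flavorAttribute = Attribute.of("com.example.flavor", Flavor::class.java)
```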
Attributes must be declared in the attribute schema found on the dependencies handler:
dependencies.attributesSchema {
// registers this attribute to the attributes schema
attribute(myAttribute)
attribute(myUsage)
}
Then configurations can be configured to set values for attributes:
configurations {
myConfiguration {
attributes {
attribute(myAttribute, 'my-value')
}
}
}
configurations {
create("myConfiguration") {
attributes {
attribute(myAttribute, "my-value")
}
}
}
For attributes whose type extends Named, the value of the attribute must be created via the object factory:
configurations {
myConfiguration {
attributes {
attribute(myUsage, project.objects.named(Usage, 'my-value'))
}
}
}
configurations {
"myConfiguration" {
attributes {
attribute(myUsage, project.objects.named(Usage::class.java, "my-value"))
}
}
}
Attribute compatibility rules
Attributes let the engine select compatible variants. However, there are cases where a provider may not have exactly what the consumer wants, but still something that it can use. For example, if the consumer is asking for the API of a library, there’s a possibility that the producer doesn’t have such a variant, but only a runtime variant. This is typical of libraries published on external repositories. In this case, we know that even if we don’t have an exact match (API), we can still compile against the runtime variant (it contains more than what we need to compile but it’s still ok to use). To deal with this, Gradle provides attribute compatibility rules. The role of a compatibility rule is to explain what variants are compatible with what the consumer asked for.
Attribute compatibility rules have to be registered via the attribute matching strategy that you can obtain from the attributes schema.
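As a sketch of what such a rule can look like, using the String-typed myAttribute declared earlier (the rule name and the "api"/"runtime" values are illustrative), a compatibility rule marks a producer value acceptable by calling CompatibilityCheckDetails.compatible():

```kotlin
// Treat a producer value of "runtime" as usable when the consumer asks for "api"
class ApiFallbackRule : AttributeCompatibilityRule<String> {
    override fun execute(details: CompatibilityCheckDetails<String>) {
        if (details.consumerValue == "api" && details.producerValue == "runtime") {
            details.compatible()
        }
    }
}

dependencies.attributesSchema {
    attribute(myAttribute) {
        // the matching strategy of the attribute exposes the rule chain
        compatibilityRules.add(ApiFallbackRule::class.java)
    }
}
```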
Attribute disambiguation rules
Because multiple values for an attribute can be compatible with the requested attribute, Gradle needs to choose between the candidates. This is done by implementing an attribute disambiguation rule.
Attribute disambiguation rules have to be registered via the attribute matching strategy that you can obtain from the attributes schema.
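As a sketch, again using the String-typed myAttribute declared earlier (the rule name and values are illustrative), a disambiguation rule picks one value among the compatible candidates via MultipleCandidatesDetails.closestMatch():

```kotlin
// When several candidate values are compatible, prefer the exact "api" variant
class PreferApiRule : AttributeDisambiguationRule<String> {
    override fun execute(details: MultipleCandidatesDetails<String>) {
        if (details.candidateValues.contains("api")) {
            details.closestMatch("api")
        }
    }
}

dependencies.attributesSchema {
    attribute(myAttribute) {
        disambiguationRules.add(PreferApiRule::class.java)
    }
}
```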
Transforming dependency artifacts on resolution
As described in different kinds of configurations, there may be different variants for the same dependency.
For example, an external Maven dependency has a variant which should be used when compiling against the dependency (java-api), and a variant for running an application which uses the dependency (java-runtime).
A project dependency has even more variants; for example, the classes of the project which are used for compilation are available as classes directories (org.gradle.usage=java-api, org.gradle.libraryelements=classes) or as JARs (org.gradle.usage=java-api, org.gradle.libraryelements=jar).
The variants of a dependency may differ in their transitive dependencies or in the artifact itself.
For example, the java-api and java-runtime variants of a Maven dependency only differ in the transitive dependencies, and both use the same artifact: the JAR file.
For a project dependency, the java-api,classes and the java-api,jars variants have the same transitive dependencies but different artifacts: the classes directories and the JAR files, respectively.
Gradle identifies a variant of a dependency uniquely by its set of attributes.
The java-api variant of a dependency is the variant identified by the org.gradle.usage attribute with value java-api.
When Gradle resolves a configuration, the attributes on the resolved configuration determine the requested attributes.
For all dependencies in the configuration, the variant with the requested attributes is selected when resolving the configuration.
For example, when the configuration requests org.gradle.usage=java-api, org.gradle.libraryelements=classes on a project dependency, then the classes directory is selected as the artifact.
When the dependency does not have a variant with the requested attributes, resolving the configuration fails.
Sometimes it is possible to transform the artifact of the dependency into the requested variant without changing the transitive dependencies.
For example, unzipping a JAR transforms the artifact of the java-api,jars variant into the java-api,classes variant.
Such a transformation is called an artifact transform.
Gradle allows registering artifact transforms, and when the dependency does not have the requested variant, Gradle will try to find a chain of artifact transforms for creating the variant.
Artifact transform selection and execution
As described above, when Gradle resolves a configuration and a dependency in the configuration does not have a variant with the requested attributes, Gradle tries to find a chain of artifact transforms to create the variant.
The process of finding a matching chain of artifact transforms is called artifact transform selection.
Each registered transform converts from a set of attributes to a set of attributes.
For example, the unzip transform can convert from org.gradle.usage=java-api, org.gradle.libraryelements=jars to org.gradle.usage=java-api, org.gradle.libraryelements=classes.
In order to find a chain, Gradle starts with the requested attributes and then considers all transforms which modify some of the requested attributes as possible paths leading there. Going backwards, Gradle tries to obtain a path to some existing variant using transforms.
For example, consider a minified attribute with two values: true and false.
The minified attribute represents a variant of a dependency with unnecessary class files removed.
There is an artifact transform registered which can transform minified from false to true.
When minified=true is requested for a dependency, and there are only variants with minified=false, then Gradle selects the registered minify transform.
The minify transform is able to transform the artifact of the dependency with minified=false to the artifact with minified=true.
Of all the found transform chains, Gradle tries to select the best one:
- If there is only one transform chain, it is selected.
- If there are two transform chains, and one is a suffix of the other one, it is selected.
- If there is a shortest transform chain, then it is selected.
- In all other cases, the selection fails and an error is reported.
Note: Gradle does not try to select artifact transforms when there is already a variant of the dependency matching the requested attributes.
After selecting the required artifact transforms, Gradle resolves the variants of the dependencies which are necessary for the initial transform in the chain. As soon as Gradle finishes resolving the artifacts for the variant, either by downloading an external dependency or executing a task producing the artifact, Gradle starts transforming the artifacts of the variant with the selected chain of artifact transforms. Gradle executes the transform chains in parallel when possible.
Picking up the minify example above, consider a configuration with two dependencies, the external guava dependency and a project dependency on the producer project.
The configuration has the attributes org.gradle.usage=java-runtime,org.gradle.libraryelements=jar,minified=true.
The external guava dependency has two variants:
- org.gradle.usage=java-runtime,org.gradle.libraryelements=jar,minified=false and
- org.gradle.usage=java-api,org.gradle.libraryelements=jar,minified=false.
Using the minify transform, Gradle can convert the variant org.gradle.usage=java-runtime,org.gradle.libraryelements=jar,minified=false of guava to org.gradle.usage=java-runtime,org.gradle.libraryelements=jar,minified=true, which are the requested attributes.
The project dependency also has variants:
- org.gradle.usage=java-runtime,org.gradle.libraryelements=jar,minified=false,
- org.gradle.usage=java-runtime,org.gradle.libraryelements=classes,minified=false,
- org.gradle.usage=java-api,org.gradle.libraryelements=jar,minified=false,
- org.gradle.usage=java-api,org.gradle.libraryelements=classes,minified=false,
- and a few more.
Again, using the minify transform, Gradle can convert the variant org.gradle.usage=java-runtime,org.gradle.libraryelements=jar,minified=false of the project producer to org.gradle.usage=java-runtime,org.gradle.libraryelements=jar,minified=true, which are the requested attributes.
When the configuration is resolved, Gradle needs to download the guava JAR and minify it.
Gradle also needs to execute the producer:jar task to generate the JAR artifact of the project and then minify it.
The download and minification of guava.jar happen in parallel with the execution of the producer:jar task and the minification of the resulting JAR.
Here is how to set up the minified attribute so that the above works.
You need to register the new attribute in the schema, add it to all JAR artifacts and request it on all resolvable configurations.
def artifactType = Attribute.of('artifactType', String)
def minified = Attribute.of('minified', Boolean)
dependencies {
attributesSchema {
attribute(minified) // (1)
}
artifactTypes.getByName("jar") {
attributes.attribute(minified, false) // (2)
}
}
configurations.all {
afterEvaluate {
if (canBeResolved) {
attributes.attribute(minified, true) // (3)
}
}
}
dependencies {
registerTransform(Minify) {
from.attribute(minified, false).attribute(artifactType, "jar")
to.attribute(minified, true).attribute(artifactType, "jar")
}
}
dependencies { // (4)
implementation('com.google.guava:guava:27.1-jre')
implementation(project(':producer'))
}
val artifactType = Attribute.of("artifactType", String::class.java)
val minified = Attribute.of("minified", Boolean::class.javaObjectType)
dependencies {
attributesSchema {
attribute(minified) // (1)
}
artifactTypes.getByName("jar") {
attributes.attribute(minified, false) // (2)
}
}
configurations.all {
afterEvaluate {
if (isCanBeResolved) {
attributes.attribute(minified, true) // (3)
}
}
}
dependencies {
registerTransform(Minify::class) {
from.attribute(minified, false).attribute(artifactType, "jar")
to.attribute(minified, true).attribute(artifactType, "jar")
}
}
dependencies { // (4)
implementation("com.google.guava:guava:27.1-jre")
implementation(project(":producer"))
}
1. Add the attribute to the schema
2. All JAR files are not minified by default
3. Request minified=true on all resolvable configurations
4. Add the dependencies which will be transformed
You can now see what happens when we run the resolveRuntimeClasspath task, which resolves the runtimeClasspath configuration.
Observe that Gradle transforms the project dependency before the resolveRuntimeClasspath task starts.
Gradle transforms the binary dependencies when it executes the resolveRuntimeClasspath task.
> gradle resolveRuntimeClasspath
> Task :producer:compileJava
> Task :producer:processResources NO-SOURCE
> Task :producer:classes
> Task :producer:jar

> Transform artifact producer.jar (project :producer) with Minify
Nothing to minify - using producer.jar unchanged

> Task :resolveRuntimeClasspath
Minifying guava-27.1-jre.jar
Nothing to minify - using listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar unchanged
Nothing to minify - using jsr305-3.0.2.jar unchanged
Nothing to minify - using checker-qual-2.5.2.jar unchanged
Nothing to minify - using error_prone_annotations-2.2.0.jar unchanged
Nothing to minify - using j2objc-annotations-1.1.jar unchanged
Nothing to minify - using animal-sniffer-annotations-1.17.jar unchanged
Nothing to minify - using failureaccess-1.0.1.jar unchanged

BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed
Implementing artifact transforms
Similar to task types, an artifact transform consists of an action and some parameters. The major difference to custom task types is that the action and the parameters are implemented as two separate classes.
The implementation of the artifact transform action is a class implementing TransformAction.
You need to implement the transform() method on the action, which converts an input artifact into zero, one or multiple output artifacts.
Most artifact transforms will be one-to-one, so the transform method will transform the input artifact to exactly one output artifact.
The implementation of the artifact transform action needs to register each output artifact by calling TransformOutputs.dir() or TransformOutputs.file().
You can only supply two types of paths to the dir or file methods:
- An absolute path to the input artifact or in the input artifact (for an input directory).
- A relative path.
Gradle uses the absolute path as the location of the output artifact.
For example, if the input artifact is an exploded WAR, then the transform action can call TransformOutputs.file() for all JAR files in the WEB-INF/lib directory.
The output of the transform would then be the library JARs of the web application.
For a relative path, the dir() or file() method returns a workspace to the transform action.
The implementation of the transform action needs to create the transformed artifact at the location of the provided workspace.
The output artifacts replace the input artifact in the transformed variant in the order they were registered.
For example, if the configuration consists of the artifacts lib1.jar, lib2.jar, lib3.jar, and the transform action registers a minified output artifact <artifact-name>-min.jar for the input artifact, then the transformed configuration consists of the artifacts lib1-min.jar, lib2-min.jar and lib3-min.jar.
Here is the implementation of an Unzip transform which transforms a JAR file into a classes directory by unzipping it.
The Unzip transform does not require any parameters.
Note how the implementation uses @InputArtifact to inject the artifact to transform into the action.
It requests a directory for the unzipped classes by using TransformOutputs.dir() and then unzips the JAR file into this directory.
abstract class Unzip implements TransformAction<TransformParameters.None> { // (1)
@InputArtifact // (2)
abstract Provider<FileSystemLocation> getInputArtifact()
@Override
void transform(TransformOutputs outputs) {
def input = inputArtifact.get().asFile
def unzipDir = outputs.dir(input.name) // (3)
unzipTo(input, unzipDir) // (4)
}
private static void unzipTo(File zipFile, File unzipDir) {
// implementation...
}
}
abstract class Unzip : TransformAction<TransformParameters.None> { // (1)
@get:InputArtifact // (2)
abstract val inputArtifact: Provider<FileSystemLocation>
override fun transform(outputs: TransformOutputs) {
val input = inputArtifact.get().asFile
val unzipDir = outputs.dir(input.name) // (3)
unzipTo(input, unzipDir) // (4)
}
private fun unzipTo(zipFile: File, unzipDir: File) {
// implementation...
}
}
1. Use TransformParameters.None if the transform does not use parameters
2. Inject the input artifact
3. Request an output location for the unzipped files
4. Do the actual work of the transform
An artifact transform may require parameters, like a String determining some filter, or some file collection which is used for supporting the transformation of the input artifact.
In order to pass those parameters to the transform action, you need to define a new type with the desired parameters.
The type needs to implement the marker interface TransformParameters.
The parameters must be represented using managed properties and the parameters type must be a managed type.
You can use an interface declaring the getters and Gradle will generate the implementation.
All getters need to have proper input annotations, see the table in the section on incremental build.
You can find out more about implementing artifact transform parameters in Developing Custom Gradle Types.
Here is the implementation of a Minify transform that makes JARs smaller by only keeping certain classes in them.
The Minify transform requires the classes to keep as parameters.
Observe how you can obtain the parameters by TransformAction.getParameters() in the transform() method.
The implementation of the transform() method requests a location for the minified JAR by using TransformOutputs.file() and then creates the minified JAR at this location.
abstract class Minify implements TransformAction<Parameters> { // (1)
interface Parameters extends TransformParameters { // (2)
@Input
Map<String, Set<String>> getKeepClassesByArtifact()
void setKeepClassesByArtifact(Map<String, Set<String>> keepClasses)
}
@PathSensitive(PathSensitivity.NAME_ONLY)
@InputArtifact
abstract Provider<FileSystemLocation> getInputArtifact()
@Override
void transform(TransformOutputs outputs) {
def fileName = inputArtifact.get().asFile.name
for (entry in parameters.keepClassesByArtifact) { // (3)
if (fileName.startsWith(entry.key)) {
def nameWithoutExtension = fileName.substring(0, fileName.length() - 4)
minify(inputArtifact.get().asFile, entry.value, outputs.file("${nameWithoutExtension}-min.jar"))
return
}
}
println "Nothing to minify - using ${fileName} unchanged"
outputs.file(inputArtifact) // (4)
}
private void minify(File artifact, Set<String> keepClasses, File jarFile) {
println "Minifying ${artifact.name}"
// Implementation ...
}
}
abstract class Minify : TransformAction<Minify.Parameters> { // (1)
interface Parameters : TransformParameters { // (2)
@get:Input
var keepClassesByArtifact: Map<String, Set<String>>
}
@get:PathSensitive(PathSensitivity.NAME_ONLY)
@get:InputArtifact
abstract val inputArtifact: Provider<FileSystemLocation>
override fun transform(outputs: TransformOutputs) {
val fileName = inputArtifact.get().asFile.name
for (entry in parameters.keepClassesByArtifact) { // (3)
if (fileName.startsWith(entry.key)) {
val nameWithoutExtension = fileName.substring(0, fileName.length - 4)
minify(inputArtifact.get().asFile, entry.value, outputs.file("${nameWithoutExtension}-min.jar"))
return
}
}
println("Nothing to minify - using ${fileName} unchanged")
outputs.file(inputArtifact) // (4)
}
private fun minify(artifact: File, keepClasses: Set<String>, jarFile: File) {
println("Minifying ${artifact.name}")
// Implementation ...
}
}
1. Declare the parameter type
2. Interface for the transform parameters
3. Use the parameters
4. Use the unchanged input artifact when no minification is required
Remember that the input artifact is a dependency, which may have its own dependencies.
If your artifact transform needs access to those transitive dependencies, it can declare an abstract getter returning a FileCollection and annotate it with @InputArtifactDependencies.
When your transform runs, Gradle will inject the transitive dependencies into that FileCollection property by implementing the getter.
Note that using input artifact dependencies in a transform has performance implications; only inject them when you really need them.
Moreover, artifact transforms can make use of the build cache for their outputs.
To enable the build cache for an artifact transform, add the @CacheableTransform annotation on the action class.
For cacheable transforms, you must annotate the @InputArtifact property, and any property marked with @InputArtifactDependencies, with normalization annotations such as @PathSensitive.
The following example shows a more complicated transform. It moves some selected classes of a JAR to a different package, rewriting the byte code of the moved classes and of all classes using them (class relocation). In order to determine the classes to relocate, it looks at the packages of the input artifact and the dependencies of the input artifact. It also does not relocate packages contained in JAR files of an external classpath.
@CacheableTransform // (1)
abstract class ClassRelocator implements TransformAction<Parameters> {
interface Parameters extends TransformParameters { // (2)
@CompileClasspath // (3)
ConfigurableFileCollection getExternalClasspath()
@Input
Property<String> getExcludedPackage()
}
@Classpath // (4)
@InputArtifact
abstract Provider<FileSystemLocation> getPrimaryInput()
@CompileClasspath
@InputArtifactDependencies // (5)
abstract FileCollection getDependencies()
@Override
void transform(TransformOutputs outputs) {
def primaryInputFile = primaryInput.get().asFile
if (parameters.externalClasspath.contains(primaryInputFile)) { // (6)
outputs.file(primaryInput)
} else {
def baseName = primaryInputFile.name.substring(0, primaryInputFile.name.length() - 4)
relocateJar(outputs.file("$baseName-relocated.jar"))
}
}
private void relocateJar(File output) {
// implementation...
def relocatedPackages = (dependencies.collectMany { readPackages(it) } + readPackages(primaryInput.get().asFile)) as Set
def nonRelocatedPackages = parameters.externalClasspath.collectMany { readPackages(it) }
def relocations = (relocatedPackages - nonRelocatedPackages).collect { packageName ->
def toPackage = "relocated.$packageName"
println("$packageName -> $toPackage")
new Relocation(packageName, toPackage)
}
new JarRelocator(primaryInput.get().asFile, output, relocations).run()
}
}
@CacheableTransform // (1)
abstract class ClassRelocator : TransformAction<ClassRelocator.Parameters> {
interface Parameters : TransformParameters { // (2)
@get:CompileClasspath // (3)
val externalClasspath: ConfigurableFileCollection
@get:Input
val excludedPackage: Property<String>
}
@get:Classpath // (4)
@get:InputArtifact
abstract val primaryInput: Provider<FileSystemLocation>
@get:CompileClasspath
@get:InputArtifactDependencies // (5)
abstract val dependencies: FileCollection
override fun transform(outputs: TransformOutputs) {
val primaryInputFile = primaryInput.get().asFile
if (parameters.externalClasspath.contains(primaryInputFile)) { // (6)
outputs.file(primaryInput)
} else {
val baseName = primaryInputFile.name.substring(0, primaryInputFile.name.length - 4)
relocateJar(outputs.file("$baseName-relocated.jar"))
}
}
private fun relocateJar(output: File) {
// implementation...
val relocatedPackages = (dependencies.flatMap { it.readPackages() } + primaryInput.get().asFile.readPackages()).toSet()
val nonRelocatedPackages = parameters.externalClasspath.flatMap { it.readPackages() }
val relocations = (relocatedPackages - nonRelocatedPackages).map { packageName ->
val toPackage = "relocated.$packageName"
println("$packageName -> $toPackage")
Relocation(packageName, toPackage)
}
JarRelocator(primaryInput.get().asFile, output, relocations).run()
}
}
1. Declare the transform cacheable
2. Interface for the transform parameters
3. Declare input type for each parameter
4. Declare a normalization for the input artifact
5. Inject the input artifact dependencies
6. Use the parameters
Registering artifact transforms
You need to register the artifact transform actions, providing parameters if necessary, so that they can be selected when resolving dependencies.
In order to register an artifact transform, you must use registerTransform() within the dependencies {} block.
There are a few points to consider when using registerTransform():
- The from and to attributes are required.
- The transform action itself can have configuration options. You can configure them with the parameters {} block.
- You must register the transform on the project that has the configuration that will be resolved.
- You can supply any type implementing TransformAction to the registerTransform() method.
For example, imagine you want to unpack some dependencies and put the unpacked directories and files on the classpath.
You can do so by registering an artifact transform action of type Unzip, as shown here:
def artifactType = Attribute.of('artifactType', String)
dependencies {
registerTransform(Unzip) {
from.attribute(artifactType, 'jar')
to.attribute(artifactType, 'java-classes-directory')
}
}
val artifactType = Attribute.of("artifactType", String::class.java)
dependencies {
registerTransform(Unzip::class) {
from.attribute(artifactType, "jar")
to.attribute(artifactType, "java-classes-directory")
}
}
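The Unzip action referenced above is a user-implemented TransformAction; its job is simply to expand the archive into a directory obtained from TransformOutputs. As a rough sketch of the extraction step only, here is a self-contained Kotlin helper using just java.util.zip (the name unzipTo and its signature are illustrative, not part of the Gradle API):

```kotlin
import java.io.File
import java.util.zip.ZipFile

// Expand every entry of zipFile into outputDir, preserving relative paths.
// In a real TransformAction, outputDir would come from outputs.dir(...).
fun unzipTo(zipFile: File, outputDir: File) {
    ZipFile(zipFile).use { zip ->
        for (entry in zip.entries()) {
            val target = File(outputDir, entry.name)
            if (entry.isDirectory) {
                target.mkdirs()
            } else {
                target.parentFile.mkdirs()
                zip.getInputStream(entry).use { input ->
                    target.outputStream().use { output -> input.copyTo(output) }
                }
            }
        }
    }
}
```

In the actual transform, this logic would run inside transform(), reading the archive from the injected input artifact and writing into a directory registered on TransformOutputs.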
Another example is minifying JARs by keeping only certain class files from them.
Note the use of the parameters {} block to provide the classes to keep in the minified JARs to the Minify transform.
def artifactType = Attribute.of('artifactType', String)
def minified = Attribute.of('minified', Boolean)
def keepPatterns = [
"guava": [
"com.google.common.base.Optional",
"com.google.common.base.AbstractIterator"
] as Set
]
dependencies {
registerTransform(Minify) {
from.attribute(minified, false).attribute(artifactType, "jar")
to.attribute(minified, true).attribute(artifactType, "jar")
parameters {
keepClassesByArtifact = keepPatterns
}
}
}
val artifactType = Attribute.of("artifactType", String::class.java)
val minified = Attribute.of("minified", Boolean::class.javaObjectType)
val keepPatterns = mapOf(
"guava" to setOf(
"com.google.common.base.Optional",
"com.google.common.base.AbstractIterator"
)
)
dependencies {
registerTransform(Minify::class) {
from.attribute(minified, false).attribute(artifactType, "jar")
to.attribute(minified, true).attribute(artifactType, "jar")
parameters {
keepClassesByArtifact = keepPatterns
}
}
}
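The heavy lifting of such a Minify action is rewriting each JAR so that only the configured classes survive. A minimal sketch of that filtering step in plain Kotlin, using only java.util.zip (the helper minifyJar and its signature are hypothetical, not part of the Gradle API):

```kotlin
import java.io.File
import java.util.zip.ZipEntry
import java.util.zip.ZipFile
import java.util.zip.ZipOutputStream

// Copy input to output, keeping only .class entries whose fully qualified
// class name is in keepClasses; non-class entries (resources) are kept as-is.
fun minifyJar(input: File, output: File, keepClasses: Set<String>) {
    ZipFile(input).use { zip ->
        ZipOutputStream(output.outputStream()).use { out ->
            for (entry in zip.entries()) {
                if (entry.isDirectory) continue
                val name = entry.name
                val keep = if (name.endsWith(".class")) {
                    // "com/example/Foo.class" -> "com.example.Foo"
                    name.removeSuffix(".class").replace('/', '.') in keepClasses
                } else {
                    true
                }
                if (keep) {
                    out.putNextEntry(ZipEntry(name))
                    zip.getInputStream(entry).use { it.copyTo(out) }
                    out.closeEntry()
                }
            }
        }
    }
}
```

A real Minify action would look up the keep set for the current artifact in the keepClassesByArtifact parameter and pass only the unresolved JAR through when no patterns match.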
Implementing incremental artifact transforms
Similar to incremental tasks, artifact transforms can avoid work by only processing changed files from the last execution. This is done by using the InputChanges interface. For artifact transforms, only the input artifact is an incremental input, and therefore the transform can only query for changes there. In order to use InputChanges in the transform action, inject it into the action. For more information on how to use InputChanges, see the corresponding documentation for incremental tasks.
Here is an example of an incremental transform that counts the lines of code in Java source files:
abstract class CountLoc implements TransformAction<TransformParameters.None> {
@Inject
abstract InputChanges getInputChanges()
@PathSensitive(PathSensitivity.RELATIVE)
@InputArtifact
abstract Provider<FileSystemLocation> getInput()
@Override
void transform(TransformOutputs outputs) { // (1)
def outputDir = outputs.dir("${input.get().asFile.name}.loc")
println("Running transform on ${input.get().asFile.name}, incremental: ${inputChanges.incremental}")
inputChanges.getFileChanges(input).forEach { change -> // (2)
def changedFile = change.file
if (change.fileType != FileType.FILE) {
return
}
def outputLocation = new File(outputDir, "${change.normalizedPath}.loc")
switch (change.changeType) {
case ADDED:
case MODIFIED:
println("Processing file ${changedFile.name}")
outputLocation.parentFile.mkdirs()
outputLocation.text = changedFile.readLines().size()
break
case REMOVED:
println("Removing leftover output file ${outputLocation.name}")
outputLocation.delete()
break
}
}
}
}
abstract class CountLoc : TransformAction<TransformParameters.None> {
@get:Inject
abstract val inputChanges: InputChanges
@get:PathSensitive(PathSensitivity.RELATIVE)
@get:InputArtifact
abstract val input: Provider<FileSystemLocation>
override fun transform(outputs: TransformOutputs) { // (1)
val outputDir = outputs.dir("${input.get().asFile.name}.loc")
println("Running transform on ${input.get().asFile.name}, incremental: ${inputChanges.isIncremental}")
inputChanges.getFileChanges(input).forEach { change -> // (2)
val changedFile = change.file
if (change.fileType != FileType.FILE) {
return@forEach
}
val outputLocation = outputDir.resolve("${change.normalizedPath}.loc")
when (change.changeType) {
ChangeType.ADDED, ChangeType.MODIFIED -> {
println("Processing file ${changedFile.name}")
outputLocation.parentFile.mkdirs()
outputLocation.writeText(changedFile.readLines().size.toString())
}
ChangeType.REMOVED -> {
println("Removing leftover output file ${outputLocation.name}")
outputLocation.delete()
}
}
}
}
}
1. Inject InputChanges
2. Query for changes in the input artifact
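Like any other transform, an incremental action such as CountLoc must still be registered before dependency resolution can select it. A sketch in the Kotlin DSL, assuming hypothetical artifactType values ("java-source-directory" and "loc-stats") chosen purely for illustration:

```kotlin
val artifactType = Attribute.of("artifactType", String::class.java)

dependencies {
    registerTransform(CountLoc::class) {
        // Hypothetical attribute values; use the ones your pipeline
        // actually produces and consumes.
        from.attribute(artifactType, "java-source-directory")
        to.attribute(artifactType, "loc-stats")
    }
}
```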