diff --git a/.github/dco.yml b/.github/dco.yml
new file mode 100644
index 000000000..0c4b142e9
--- /dev/null
+++ b/.github/dco.yml
@@ -0,0 +1,2 @@
+require:
+ members: false
diff --git a/.github/workflows/stale.yml b/.github/workflows/stale.yml
new file mode 100644
index 000000000..81f84f934
--- /dev/null
+++ b/.github/workflows/stale.yml
@@ -0,0 +1,27 @@
+# This workflow warns and then closes issues and PRs that have had no activity for a specified amount of time.
+#
+# You can adjust the behavior by modifying this file.
+# For more information, see:
+# https://github.com/actions/stale
+name: Mark stale issues and pull requests
+
+on:
+ schedule:
+ - cron: '42 17 * * *'
+ workflow_dispatch:
+jobs:
+ stale:
+
+ runs-on: ubuntu-latest
+ permissions:
+ issues: write
+ pull-requests: write
+
+ steps:
+ - uses: actions/stale@v5
+ with:
+ repo-token: ${{ secrets.GITHUB_TOKEN }}
+ stale-issue-message: 'This issue has been stale for over 60 days'
+ stale-pr-message: 'This PR has been stale for over 60 days'
+ stale-issue-label: 'no-issue-activity'
+ stale-pr-label: 'no-pr-activity'
diff --git a/.gitignore b/.gitignore
index b2c31d2dc..31a040841 100644
--- a/.gitignore
+++ b/.gitignore
@@ -35,5 +35,7 @@ pom.xml.versionsBackup
node
node_modules
build
-package.json
+/package.json
package-lock.json
+*samconfig.toml
+*.aws-sam/
diff --git a/README.adoc b/README.adoc
index cb06b375d..072334cd4 100644
--- a/README.adoc
+++ b/README.adoc
@@ -23,245 +23,91 @@ image::https://travis-ci.org/spring-cloud/spring-cloud-function.svg?branch={bran
= Building
:page-section-summary-toc: 1
-:spring-cloud-build-branch: main
-
-Spring Cloud is released under the non-restrictive Apache 2.0 license,
-and follows a very standard Github development process, using Github
-tracker for issues and merging pull requests into main. If you want
-to contribute even something trivial please do not hesitate, but
-follow the guidelines below.
-
-[[sign-the-contributor-license-agreement]]
-== Sign the Contributor License Agreement
-
-Before we accept a non-trivial patch or pull request we will need you to sign the
-https://cla.pivotal.io/sign/spring[Contributor License Agreement].
-Signing the contributor's agreement does not grant anyone commit rights to the main
-repository, but it does mean that we can accept your contributions, and you will get an
-author credit if we do. Active contributors might be asked to join the core team, and
-given the ability to merge pull requests.
-
-[[code-of-conduct]]
-== Code of Conduct
-This project adheres to the Contributor Covenant https://github.com/spring-cloud/spring-cloud-build/blob/main/docs/src/main/asciidoc/code-of-conduct.adoc[code of
-conduct]. By participating, you are expected to uphold this code. Please report
-unacceptable behavior to spring-code-of-conduct@pivotal.io.
-
-[[code-conventions-and-housekeeping]]
-== Code Conventions and Housekeeping
-None of these is essential for a pull request, but they will all help. They can also be
-added after the original pull request but before a merge.
-
-* Use the Spring Framework code format conventions. If you use Eclipse
- you can import formatter settings using the
- `eclipse-code-formatter.xml` file from the
- https://raw.githubusercontent.com/spring-cloud/spring-cloud-build/main/spring-cloud-dependencies-parent/eclipse-code-formatter.xml[Spring
- Cloud Build] project. If using IntelliJ, you can use the
- https://plugins.jetbrains.com/plugin/6546[Eclipse Code Formatter
- Plugin] to import the same file.
-* Make sure all new `.java` files to have a simple Javadoc class comment with at least an
- `@author` tag identifying you, and preferably at least a paragraph on what the class is
- for.
-* Add the ASF license header comment to all new `.java` files (copy from existing files
- in the project)
-* Add yourself as an `@author` to the .java files that you modify substantially (more
- than cosmetic changes).
-* Add some Javadocs and, if you change the namespace, some XSD doc elements.
-* A few unit tests would help a lot as well -- someone has to do it.
-* If no-one else is using your branch, please rebase it against the current main (or
- other target branch in the main project).
-* When writing a commit message please follow https://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html[these conventions],
- if you are fixing an existing issue please add `Fixes gh-XXXX` at the end of the commit
- message (where XXXX is the issue number).
-
-[[checkstyle]]
-== Checkstyle
-
-Spring Cloud Build comes with a set of checkstyle rules. You can find them in the `spring-cloud-build-tools` module. The most notable files under the module are:
-
-.spring-cloud-build-tools/
-----
-└── src
- ├── checkstyle
- │ └── checkstyle-suppressions.xml <3>
- └── main
- └── resources
- ├── checkstyle-header.txt <2>
- └── checkstyle.xml <1>
-----
-<1> Default Checkstyle rules
-<2> File header setup
-<3> Default suppression rules
-
-[[checkstyle-configuration]]
-=== Checkstyle configuration
-
-Checkstyle rules are *disabled by default*. To add checkstyle to your project just define the following properties and plugins.
-
-.pom.xml
-----
-
-true <1>
- true
- <2>
- true
- <3>
-
-
-
-
- <4>
- io.spring.javaformat
- spring-javaformat-maven-plugin
-
- <5>
- org.apache.maven.plugins
- maven-checkstyle-plugin
-
-
-
-
-
- <5>
- org.apache.maven.plugins
- maven-checkstyle-plugin
-
-
-
-
-----
-<1> Fails the build upon Checkstyle errors
-<2> Fails the build upon Checkstyle violations
-<3> Checkstyle analyzes also the test sources
-<4> Add the Spring Java Format plugin that will reformat your code to pass most of the Checkstyle formatting rules
-<5> Add checkstyle plugin to your build and reporting phases
-
-If you need to suppress some rules (e.g. line length needs to be longer), then it's enough for you to define a file under `${project.root}/src/checkstyle/checkstyle-suppressions.xml` with your suppressions. Example:
-
-.projectRoot/src/checkstyle/checkstyle-suppresions.xml
-----
-
-
-
-
-
-
-----
-
-It's advisable to copy the `${spring-cloud-build.rootFolder}/.editorconfig` and `${spring-cloud-build.rootFolder}/.springformat` to your project. That way, some default formatting rules will be applied. You can do so by running this script:
-
-```bash
-$ curl https://raw.githubusercontent.com/spring-cloud/spring-cloud-build/main/.editorconfig -o .editorconfig
-$ touch .springformat
-```
+:jdkversion: 17
-[[ide-setup]]
-== IDE setup
+[[basic-compile-and-test]]
+== Basic Compile and Test
-[[intellij-idea]]
-=== Intellij IDEA
+To build the source you will need to install JDK {jdkversion}.
-In order to setup Intellij you should import our coding conventions, inspection profiles and set up the checkstyle plugin.
-The following files can be found in the https://github.com/spring-cloud/spring-cloud-build/tree/main/spring-cloud-build-tools[Spring Cloud Build] project.
+Spring Cloud uses Maven for most build-related activities, and you
+should be able to get off the ground quite quickly by cloning the
+project you are interested in and typing
-.spring-cloud-build-tools/
----
-└── src
- ├── checkstyle
- │ └── checkstyle-suppressions.xml <3>
- └── main
- └── resources
- ├── checkstyle-header.txt <2>
- ├── checkstyle.xml <1>
- └── intellij
- ├── Intellij_Project_Defaults.xml <4>
- └── Intellij_Spring_Boot_Java_Conventions.xml <5>
+$ ./mvnw install
----
-<1> Default Checkstyle rules
-<2> File header setup
-<3> Default suppression rules
-<4> Project defaults for Intellij that apply most of Checkstyle rules
-<5> Project style conventions for Intellij that apply most of Checkstyle rules
-
-.Code style
-
-image::https://raw.githubusercontent.com/spring-cloud/spring-cloud-build/main/docs/modules/ROOT/assets/images/intellij-code-style.png[Code style]
-
-Go to `File` -> `Settings` -> `Editor` -> `Code style`. There click on the icon next to the `Scheme` section. There, click on the `Import Scheme` value and pick the `Intellij IDEA code style XML` option. Import the `spring-cloud-build-tools/src/main/resources/intellij/Intellij_Spring_Boot_Java_Conventions.xml` file.
-.Inspection profiles
-
-image::https://raw.githubusercontent.com/spring-cloud/spring-cloud-build/main/docs/modules/ROOT/assets/images/intellij-inspections.png[Code style]
-
-Go to `File` -> `Settings` -> `Editor` -> `Inspections`. There click on the icon next to the `Profile` section. There, click on the `Import Profile` and import the `spring-cloud-build-tools/src/main/resources/intellij/Intellij_Project_Defaults.xml` file.
-
-.Checkstyle
-
-To have Intellij work with Checkstyle, you have to install the `Checkstyle` plugin. It's advisable to also install the `Assertions2Assertj` to automatically convert the JUnit assertions
-
-image::https://raw.githubusercontent.com/spring-cloud/spring-cloud-build/main/docs/modules/ROOT/assets/images/intellij-checkstyle.png[Checkstyle]
-
-Go to `File` -> `Settings` -> `Other settings` -> `Checkstyle`. There click on the `+` icon in the `Configuration file` section. There, you'll have to define where the checkstyle rules should be picked from. In the image above, we've picked the rules from the cloned Spring Cloud Build repository. However, you can point to the Spring Cloud Build's GitHub repository (e.g. for the `checkstyle.xml` : `https://raw.githubusercontent.com/spring-cloud/spring-cloud-build/main/spring-cloud-build-tools/src/main/resources/checkstyle.xml`). We need to provide the following variables:
-
-- `checkstyle.header.file` - please point it to the Spring Cloud Build's, `spring-cloud-build-tools/src/main/resources/checkstyle-header.txt` file either in your cloned repo or via the `https://raw.githubusercontent.com/spring-cloud/spring-cloud-build/main/spring-cloud-build-tools/src/main/resources/checkstyle-header.txt` URL.
-- `checkstyle.suppressions.file` - default suppressions. Please point it to the Spring Cloud Build's, `spring-cloud-build-tools/src/checkstyle/checkstyle-suppressions.xml` file either in your cloned repo or via the `https://raw.githubusercontent.com/spring-cloud/spring-cloud-build/main/spring-cloud-build-tools/src/checkstyle/checkstyle-suppressions.xml` URL.
-- `checkstyle.additional.suppressions.file` - this variable corresponds to suppressions in your local project. E.g. you're working on `spring-cloud-contract`. Then point to the `project-root/src/checkstyle/checkstyle-suppressions.xml` folder. Example for `spring-cloud-contract` would be: `/home/username/spring-cloud-contract/src/checkstyle/checkstyle-suppressions.xml`.
-
-IMPORTANT: Remember to set the `Scan Scope` to `All sources` since we apply checkstyle rules for production and test sources.
-
-[[duplicate-finder]]
-== Duplicate Finder
-
-Spring Cloud Build brings along the `basepom:duplicate-finder-maven-plugin`, that enables flagging duplicate and conflicting classes and resources on the java classpath.
-
-[[duplicate-finder-configuration]]
-=== Duplicate Finder configuration
-
-Duplicate finder is *enabled by default* and will run in the `verify` phase of your Maven build, but it will only take effect in your project if you add the `duplicate-finder-maven-plugin` to the `build` section of the projecst's `pom.xml`.
-
-.pom.xml
-[source,xml]
-----
-
-
-
- org.basepom.maven
- duplicate-finder-maven-plugin
-
-
-
+NOTE: You can also install Maven (>=3.3.3) yourself and run the `mvn` command
+in place of `./mvnw` in the examples below. If you do that you also
+might need to add `-P spring` if your local Maven settings do not
+contain repository declarations for spring pre-release artifacts.
+
+NOTE: Be aware that you might need to increase the amount of memory
+available to Maven by setting a `MAVEN_OPTS` environment variable with
+a value like `-Xmx512m -XX:MaxPermSize=128m`. We try to cover this in
+the `.mvn` configuration, so if you find you have to do it to make a
+build succeed, please raise a ticket to get the settings added to
+source control.
+
+The projects that require middleware (e.g. Redis) for testing generally
+require that a local instance of https://www.docker.com/get-started[Docker] is installed and running.
+
+[[documentation]]
+== Documentation
+
+The spring-cloud-build module has a "docs" profile, and if you switch
+that on it will try to build asciidoc sources using https://docs.antora.org/antora/latest/[Antora] from
+`modules/ROOT/`.
+
+As part of that process it will look for a
+`docs/src/main/asciidoc/README.adoc` and process it by loading all the includes, but not
+parsing or rendering it, just copying it to `${main.basedir}`
+(defaults to `$\{basedir}`, i.e. the root of the project). If there are
+any changes in the README it will then show up after a Maven build as
+a modified file in the correct place. Just commit it and push the change.
+
+[[working-with-the-code]]
+== Working with the code
+If you don't have an IDE preference we would recommend that you use
+https://spring.io/tools[Spring Tools Suite] or
+https://eclipse.org[Eclipse] when working with the code. We use the
+https://eclipse.org/m2e/[m2eclipse] eclipse plugin for Maven support. Other IDEs and tools
+should also work without issue as long as they use Maven 3.3.3 or better.
+
+[[activate-the-spring-maven-profile]]
+=== Activate the Spring Maven profile
+Spring Cloud projects require the 'spring' Maven profile to be activated to resolve
+the spring milestone and snapshot repositories. Use your preferred IDE to set this
+profile to be active, or you may experience build errors.
+
+[[importing-into-eclipse-with-m2eclipse]]
+=== Importing into eclipse with m2eclipse
+We recommend the https://eclipse.org/m2e/[m2eclipse] eclipse plugin when working with
+eclipse. If you don't already have m2eclipse installed it is available from the "eclipse
+marketplace".
+
+NOTE: Older versions of m2e do not support Maven 3.3, so once the
+projects are imported into Eclipse you will also need to tell
+m2eclipse to use the right profile for the projects. If you
+see many different errors related to the POMs in the projects, check
+that you have an up to date installation. If you can't upgrade m2e,
+add the "spring" profile to your `settings.xml`. Alternatively you can
+copy the repository settings from the "spring" profile of the parent
+pom into your `settings.xml`.
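+
+If you go the `settings.xml` route, a profile along the following lines should work. This is a sketch based on the standard Spring repositories, not copied from the parent pom, so verify the URLs and repository ids against the parent pom's "spring" profile:
+
+[source,xml]
+----
+<!-- Sketch of a "spring" profile for ~/.m2/settings.xml; verify against the parent pom -->
+<settings>
+	<profiles>
+		<profile>
+			<id>spring</id>
+			<repositories>
+				<repository>
+					<id>spring-milestones</id>
+					<url>https://repo.spring.io/milestone</url>
+				</repository>
+				<repository>
+					<id>spring-snapshots</id>
+					<url>https://repo.spring.io/snapshot</url>
+					<snapshots><enabled>true</enabled></snapshots>
+				</repository>
+			</repositories>
+		</profile>
+	</profiles>
+	<activeProfiles>
+		<activeProfile>spring</activeProfile>
+	</activeProfiles>
+</settings>
+----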
+
+[[importing-into-eclipse-without-m2eclipse]]
+=== Importing into eclipse without m2eclipse
+If you prefer not to use m2eclipse you can generate eclipse project metadata using the
+following command:
+
+[indent=0]
----
-
-For other properties, we have set defaults as listed in the https://github.com/basepom/duplicate-finder-maven-plugin/wiki[plugin documentation].
-
-You can easily override them but setting the value of the selected property prefixed with `duplicate-finder-maven-plugin`. For example, set `duplicate-finder-maven-plugin.skip` to `true` in order to skip duplicates check in your build.
-
-If you need to add `ignoredClassPatterns` or `ignoredResourcePatterns` to your setup, make sure to add them in the plugin configuration section of your project:
-
-[source,xml]
+ $ ./mvnw eclipse:eclipse
----
-
-
-
- org.basepom.maven
- duplicate-finder-maven-plugin
-
-
- org.joda.time.base.BaseDateTime
- .*module-info
-
-
- changelog.txt
-
-
-
-
-
-
-----
+The generated eclipse projects can be imported by selecting `import existing projects`
+from the `file` menu.
[[contributing]]
@@ -276,21 +122,17 @@ tracker for issues and merging pull requests into main. If you want
to contribute even something trivial please do not hesitate, but
follow the guidelines below.
-[[sign-the-contributor-license-agreement]]
-== Sign the Contributor License Agreement
+[[developer-certificate-of-origin]]
+== Developer Certificate of Origin (DCO)
-Before we accept a non-trivial patch or pull request we will need you to sign the
-https://cla.pivotal.io/sign/spring[Contributor License Agreement].
-Signing the contributor's agreement does not grant anyone commit rights to the main
-repository, but it does mean that we can accept your contributions, and you will get an
-author credit if we do. Active contributors might be asked to join the core team, and
-given the ability to merge pull requests.
+All commits must include a __Signed-off-by__ trailer at the end of each commit message to indicate that the contributor agrees to the Developer Certificate of Origin.
+For additional details, please refer to the blog post https://spring.io/blog/2025/01/06/hello-dco-goodbye-cla-simplifying-contributions-to-spring[Hello DCO, Goodbye CLA: Simplifying Contributions to Spring].
[[code-of-conduct]]
== Code of Conduct
-This project adheres to the Contributor Covenant https://github.com/spring-cloud/spring-cloud-build/blob/main/docs/src/main/asciidoc/code-of-conduct.adoc[code of
+This project adheres to the Contributor Covenant https://github.com/spring-cloud/spring-cloud-build/blob/main/docs/modules/ROOT/partials/code-of-conduct.adoc[code of
conduct]. By participating, you are expected to uphold this code. Please report
-unacceptable behavior to spring-code-of-conduct@pivotal.io.
+unacceptable behavior to code-of-conduct@spring.io.
[[code-conventions-and-housekeeping]]
== Code Conventions and Housekeeping
@@ -464,7 +306,7 @@ Spring Cloud Build brings along the `basepom:duplicate-finder-maven-plugin`, th
[[duplicate-finder-configuration]]
=== Duplicate Finder configuration
-Duplicate finder is *enabled by default* and will run in the `verify` phase of your Maven build, but it will only take effect in your project if you add the `duplicate-finder-maven-plugin` to the `build` section of the projecst's `pom.xml`.
+Duplicate finder is *enabled by default* and will run in the `verify` phase of your Maven build, but it will only take effect in your project if you add the `duplicate-finder-maven-plugin` to the `build` section of the project's `pom.xml`.
.pom.xml
[source,xml]
diff --git a/docs/antora-playbook.yml b/docs/antora-playbook.yml
index 328a2e92a..158d51e05 100644
--- a/docs/antora-playbook.yml
+++ b/docs/antora-playbook.yml
@@ -1,13 +1,7 @@
antora:
extensions:
- - '@springio/antora-extensions/partial-build-extension'
- - require: '@springio/antora-extensions/latest-version-extension'
- - require: '@springio/antora-extensions/inject-collector-cache-config-extension'
- - '@antora/collector-extension'
- - '@antora/atlas-extension'
- - require: '@springio/antora-extensions/root-component-extension'
+ - require: '@springio/antora-extensions'
root_component_name: 'cloud-function'
- - '@springio/antora-extensions/static-page-extension'
site:
title: Spring Cloud Function
url: https://docs.spring.io/spring-cloud-function/reference/
@@ -36,4 +30,4 @@ runtime:
format: pretty
ui:
bundle:
- url: https://github.com/spring-io/antora-ui-spring/releases/download/v0.4.11/ui-bundle.zip
+ url: https://github.com/spring-io/antora-ui-spring/releases/download/v0.4.15/ui-bundle.zip
diff --git a/docs/modules/ROOT/assets/images/aws_spring_lambda_edit.png b/docs/modules/ROOT/assets/images/aws_spring_lambda_edit.png
new file mode 100644
index 000000000..370aeaa80
Binary files /dev/null and b/docs/modules/ROOT/assets/images/aws_spring_lambda_edit.png differ
diff --git a/docs/modules/ROOT/assets/images/aws_spring_lambda_test.png b/docs/modules/ROOT/assets/images/aws_spring_lambda_test.png
new file mode 100644
index 000000000..99ed2392f
Binary files /dev/null and b/docs/modules/ROOT/assets/images/aws_spring_lambda_test.png differ
diff --git a/docs/modules/ROOT/pages/adapters/aws-intro.adoc b/docs/modules/ROOT/pages/adapters/aws-intro.adoc
index 86419bdcf..946bbf8c0 100644
--- a/docs/modules/ROOT/pages/adapters/aws-intro.adoc
+++ b/docs/modules/ROOT/pages/adapters/aws-intro.adoc
@@ -1,18 +1,24 @@
[[aws-lambda]]
= AWS Lambda
+:page-aliases: adapters/aws.adoc
The https://aws.amazon.com/[AWS] adapter takes a Spring Cloud Function app and converts it to a form that can run in AWS Lambda.
-The details of how to get stared with AWS Lambda is out of scope of this document, so the expectation is that user has some familiarity with
-AWS and AWS Lambda and wants to learn what additional value spring provides.
+
+In general, there are two ways to run Spring applications on AWS Lambda:
+
+1. Use the AWS Lambda adapter via Spring Cloud Function to implement a functional approach as outlined below. This is a good fit for single-responsibility APIs and event- and messaging-based systems, such as handling messages from an Amazon SQS or Amazon MQ queue, processing an Apache Kafka stream, or reacting to file uploads in Amazon S3.
+2. Run a Spring Boot Web application on AWS Lambda via the https://github.com/aws/serverless-java-container[Serverless Java container project]. This is a good fit for migrations of existing Spring applications to AWS Lambda or if you build sophisticated APIs with multiple API endpoints and want to maintain the familiar `RestController` approach. This approach is outlined in more detail in <>.
+
+
+The following guide expects that you have a basic understanding of AWS and AWS Lambda and focuses on the additional value that Spring provides. The details of how to get started with AWS Lambda are out of the scope of this document. If you want to learn more, you can navigate to the https://docs.aws.amazon.com/lambda/latest/dg/concepts-basics.html[basic AWS Lambda concepts] or a complete https://catalog.workshops.aws/java-on-aws/[Java on AWS overview].
[[getting-started]]
== Getting Started
-One of the goals of Spring Cloud Function framework is to provide necessary infrastructure elements to enable a _simple function application_
-to interact in a certain way in a particular environment.
-A simple function application (in context or Spring) is an application that contains beans of type Supplier, Function or Consumer.
-So, with AWS it means that a simple function bean should somehow be recognised and executed in AWS Lambda environment.
+One of the goals of Spring Cloud Function framework is to provide the necessary infrastructure elements to enable a _simple functional application_ to be compatible with a particular environment (such as AWS Lambda).
+
+In the context of Spring, a simple functional application contains beans of type `Supplier`, `Function` or `Consumer`.
Let’s look at the example:
@@ -32,93 +38,239 @@ public class FunctionConfiguration {
}
----
-It shows a complete Spring Boot application with a function bean defined in it. What’s interesting is that on the surface this is just
-another boot app, but in the context of AWS Adapter it is also a perfectly valid AWS Lambda application. No other code or configuration
-is required. All you need to do is package it and deploy it, so let’s look how we can do that.
+You can see a complete Spring Boot application with a function bean defined in it. On the surface this is just another Spring Boot app. However, when adding the Spring Cloud Function AWS Adapter to the project it will become a perfectly valid AWS Lambda application:
+
+[source, xml]
+----
+<dependencies>
+	<dependency>
+		<groupId>org.springframework.cloud</groupId>
+		<artifactId>spring-cloud-function-adapter-aws</artifactId>
+	</dependency>
+</dependencies>
+----
+
+No other code or configuration is required. We’ve provided a sample project ready to be built and deployed. You can access it https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples/function-sample-aws[in the official Spring Cloud Function example repository].
-To make things simpler we’ve provided a sample project ready to be built and deployed and you can access it
-https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples/function-sample-aws[here].
+You simply execute `mvn clean package` to generate the JAR file. All the necessary Maven plugins have already been set up to generate
+an appropriate AWS deployable JAR file. (You can read more details about the JAR layout in <>).
-You simply execute `./mvnw clean package` to generate JAR file. All the necessary maven plugins have already been setup to generate
-appropriate AWS deployable JAR file. (You can read more details about JAR layout in <>).
+[[aws-function-handlers]]
+=== AWS Lambda Function Handler
-Then you have to upload the JAR file (via AWS dashboard or AWS CLI) to AWS.
+In contrast to traditional web applications that expose their functionality via a listener on a given HTTP port (80, 443), AWS Lambda functions are invoked at a predefined entry point, called the Lambda https://docs.aws.amazon.com/lambda/latest/dg/java-handler.html[function handler].
-When ask about _handler_ you specify `org.springframework.cloud.function.adapter.aws.FunctionInvoker::handleRequest` which is a generic request handler.
+We recommend using the built-in `org.springframework.cloud.function.adapter.aws.FunctionInvoker` handler to streamline the integration with AWS Lambda. It provides advanced features such as multi-function routing, decoupling from AWS specifics, and POJO serialization out of the box. Please refer to the <> and <> sections to learn more.
-image::AWS-deploy.png[width=800,scaledwidth="75%",align="center"]
+[[deployment-options]]
+=== Deployment
-That is all. Save and execute the function with some sample data which for this function is expected to be a
-String which function will uppercase and return back.
+After building the application, you can deploy the JAR file either manually via the AWS console, the AWS Command Line Interface (CLI), or Infrastructure as Code (IaC) tools such as https://aws.amazon.com/serverless/sam/[AWS Serverless Application Model (AWS SAM)], https://aws.amazon.com/cdk/[AWS Cloud Development Kit (AWS CDK)], https://aws.amazon.com/cloudformation/[AWS CloudFormation], or https://docs.aws.amazon.com/prescriptive-guidance/latest/choose-iac-tool/terraform.html[Terraform].
-While `org.springframework.cloud.function.adapter.aws.FunctionInvoker` is a general purpose AWS's `RequestHandler` implementation aimed at completely
-isolating you from the specifics of AWS Lambda API, for some cases you may want to specify which specific AWS's `RequestHandler` you want
-to use. The next section will explain you how you can accomplish just that.
+To create a Hello World Lambda function with the AWS console:
+
+1. Open the https://console.aws.amazon.com/lambda/home#/functions[Functions page of the Lambda console].
+2. Choose _Create function_.
+3. Select _Author from scratch_.
+4. For Function name, enter `MySpringLambdaFunction`.
+5. For Runtime, choose _Java 21_.
+6. Choose _Create function_.
+
+To upload your code and test the function:
+
+1. Upload the previously created JAR file, for example `target/function-sample-aws-0.0.1-SNAPSHOT-aws.jar`.
+
+2. Provide the entry handler method `org.springframework.cloud.function.adapter.aws.FunctionInvoker::handleRequest`.
+
+3. Navigate to the "Test" tab and click the "Test" button. The function should return the provided JSON payload in uppercase.
+
+image::aws_spring_lambda_edit.png[width=800,scaledwidth="75%",align="center"]
+
+image::aws_spring_lambda_test.png[width=800,scaledwidth="75%",align="center"]
+
+To automate your deployment with Infrastructure as Code (IaC) tools please refer to https://docs.aws.amazon.com/lambda/latest/dg/foundation-iac.html[the official AWS documentation].
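+
+As a minimal illustration, an AWS SAM template for the function built above could look like the following sketch. The resource name and sizing values are illustrative, not prescribed by this project; only the runtime, handler, and JAR path come from the steps above:
+
+[source,yaml]
+----
+# Illustrative AWS SAM template (template.yaml); names and sizes are examples only
+AWSTemplateFormatVersion: '2010-09-09'
+Transform: AWS::Serverless-2016-10-31
+Resources:
+  MySpringLambdaFunction:
+    Type: AWS::Serverless::Function
+    Properties:
+      Runtime: java21
+      MemorySize: 1024
+      Timeout: 30
+      CodeUri: target/function-sample-aws-0.0.1-SNAPSHOT-aws.jar
+      Handler: org.springframework.cloud.function.adapter.aws.FunctionInvoker::handleRequest
+----
+
+After `mvn clean package`, such a template can be deployed with `sam deploy --guided`.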
[[aws-request-handlers]]
-=== AWS Request Handlers
+== AWS Request Handlers
-While AWS Lambda allows you to implement various `RequestHandlers`, with Spring Cloud Function you don't need to implement any, and instead use the provided
- `org.springframework.cloud.function.adapter.aws.FunctionInvoker` which is the implementation of AWS's `RequestStreamHandler`.
-User doesn't need to do anything other then specify it as 'handler' on AWS dashboard when deploying function.
-It will handle most of the case including Kinesis, streaming etc. .
+As discussed in the getting started section, AWS Lambda functions are invoked at a predefined entry point, called the https://docs.aws.amazon.com/lambda/latest/dg/java-handler.html[Lambda function handler]. In its simplest form this can be a Java method reference. In the above example that would be `com.my.package.FunctionConfiguration::uppercase`. This configuration is needed to advise AWS Lambda which Java method to call in the provided JAR.
+
+When a Lambda function is invoked, it passes an additional request payload and context object to this handler method. The request payload varies based on the AWS service (Amazon API Gateway, Amazon S3, Amazon SQS, Apache Kafka etc.) that triggered the function. The context object provides additional information about the Lambda function, the invocation and the environment, for example a unique request id (https://docs.aws.amazon.com/lambda/latest/dg/java-context.html[see also Java context in the official documentation]).
+
+AWS provides predefined handler interfaces (called `RequestHandler` or `RequestStreamHandler`) to deal with payload and context objects via the `aws-lambda-java-events` and `aws-lambda-java-core` libraries.
+
+Spring Cloud Function already implements these interfaces and provides an `org.springframework.cloud.function.adapter.aws.FunctionInvoker` to completely abstract your function code
+from the specifics of AWS Lambda. This allows you to simply switch the entry point depending on which platform you run your functions on.
+
+However, for some use cases you may want to integrate deeply with the AWS environment. For example, when your function is triggered by an Amazon S3 file upload you might want to access specific Amazon S3 properties. Or, you may want to return a partial batch response when processing items from an Amazon SQS queue. In that case you can still leverage the generic `org.springframework.cloud.function.adapter.aws.FunctionInvoker`, but you will work with the dedicated AWS objects from within your function code:
+
+[source, java]
+----
+@Bean
+public Function<S3Event, String> processS3Event() {}
+
+@Bean
+public Function<SQSEvent, SQSBatchResponse> processSQSEvent() {}
-If your app has more than one `@Bean` of type `Function` etc. then you can choose the one to use by configuring `spring.cloud.function.definition`
-property or environment variable. The functions are extracted from the Spring Cloud `FunctionCatalog`. In the event you don't specify `spring.cloud.function.definition`
-the framework will attempt to find a default following the search order where it searches first for `Function` then `Consumer` and finally `Supplier`).
+----
[[type-conversion]]
=== Type Conversion
-Spring Cloud Function will attempt to transparently handle type conversion between the raw
+Another benefit of leveraging the built-in `FunctionInvoker` is that Spring Cloud Function will attempt to transparently handle type conversion between the raw
input stream and types declared by your function.
-For example, if your function signature is as such `Function` we will attempt to convert
-incoming stream event to an instance of `Foo`.
+For example, if your function signature is `Function<Foo, Bar>` it will attempt to convert the incoming stream event to an instance of `Foo`. This is especially helpful in API-triggered Lambda functions where the request body represents a business object and is not tied to AWS specifics.
-In the event type is not known or can not be determined (e.g., `Function, ?>`) we will attempt to
+If the event type is not known or cannot be determined (e.g., `Function<?, ?>`) Spring Cloud Function will attempt to
convert an incoming stream event to a generic `Map`.
[[raw-input]]
=== Raw Input
There are times when you may want to have access to a raw input. In this case all you need is to declare your
-function signature to accept `InputStream`. For example, `Function`. In this case
-we will not attempt any conversion and will pass the raw input directly to a function.
-
+function signature to accept `InputStream`, for example `Function<InputStream, ?>`.
+If specified, Spring Cloud Function will not attempt any conversion and will pass the raw input directly to the function.
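+
+Outside of any AWS or Spring machinery, the idea can be sketched in plain Java. The bean wiring is omitted and `uppercase` is a hypothetical function name; the point is that a function declared against `InputStream` receives the raw bytes and does its own decoding:
+
+[source,java]
+----
```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.function.Function;

public class RawInputSketch {

    // Declared against InputStream, so no payload conversion would be applied:
    // the function reads the raw bytes and decodes them itself.
    static final Function<InputStream, String> uppercase = in -> {
        try {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8).toUpperCase();
        }
        catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    };

    public static void main(String[] args) {
        InputStream raw = new ByteArrayInputStream("hello".getBytes(StandardCharsets.UTF_8));
        System.out.println(uppercase.apply(raw)); // prints HELLO
    }
}
```
+----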
[[aws-function-routing]]
-=== AWS Function Routing
+== AWS Function Routing
+
+One of the core features of Spring Cloud Function is https://docs.spring.io/spring-cloud-function/docs/{project-version}/reference/html/spring-cloud-function.html#_function_routing_and_filtering[routing]. This capability allows you to have one special Java method (acting as a https://docs.aws.amazon.com/lambda/latest/dg/java-handler.html[Lambda function handler]) to delegate to other internal methods. You have already seen this in action when the generic `FunctionInvoker` automatically routed the requests to your `uppercase` function in the <> section.
+
+By default, if your app has more than one `@Bean` of type `Function`, `Consumer` or `Supplier`, they are extracted from the Spring Cloud `FunctionCatalog` and the framework attempts to find a default, searching first for a `Function`, then a `Consumer` and finally a `Supplier`. These default routing capabilities are needed because `FunctionInvoker` can not determine which function to bind, so it defaults internally to `RoutingFunction`. It is recommended to provide explicit routing instructions https://docs.spring.io/spring-cloud-function/docs/{project-version}/reference/html/spring-cloud-function.html#_function_routing_and_filtering[using several mechanisms] (see https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-aws-routing[sample] for more details).
+
+The right routing mechanism depends on whether you deploy your Spring Cloud Function project as a single Lambda function or as multiple Lambda functions.
+
+[[aws-function-routing-single-multi]]
+=== Single Function vs. Multiple Functions
+
+If you implement multiple Java methods in the same Spring Cloud Function project, for example `uppercase` and `lowercase`, you either deploy two separate Lambda functions with static routing information or you provide a dynamic routing method that decides which method to call during runtime. Let's look at both approaches.
+
+1. Deploying two separate AWS Lambda functions makes sense if you have different scaling, configuration or permission requirements per function. For example, if you create two Java methods `readObjectFromAmazonS3` and `writeToAmazonDynamoDB` in the same Spring Cloud Function project, you might want to create two separate Lambda functions, because they need different permissions to talk to either S3 or DynamoDB, or because their load patterns and memory configurations differ significantly. In general, this approach is also recommended for messaging-based applications where you read from a stream or a queue, since you have a dedicated configuration per https://docs.aws.amazon.com/lambda/latest/dg/invocation-eventsourcemapping.html[Lambda event source mapping].
+
+2. A single Lambda function is a valid approach when multiple Java methods share the same permission set or provide a cohesive piece of business functionality, for example a CRUD-based Spring Cloud Function project with `createPet`, `updatePet`, `readPet` and `deletePet` methods that all talk to the same DynamoDB table and have a similar usage pattern. Using a single Lambda function improves deployment simplicity, cohesion and code reuse for shared classes (such as `PetEntity`). In addition, it can reduce cold starts between sequential invocations, because a `readPet` followed by an `updatePet` will most likely hit an already running https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtime-environment.html[Lambda execution environment]. However, when you build more sophisticated APIs, or want to keep a familiar `@RestController` approach, you may also want to evaluate the <<serverless-java-container>> option.
+
+If you favor the first approach you can also create two separate Spring Cloud Function projects and deploy them individually. This can be beneficial if different teams are responsible for maintaining and deploying the functions. However, in that case you need to deal with sharing cross-cutting concerns such as helper methods or entity classes between them. In general, we advise applying the same software modularity principles to your functional projects as you do for traditional web-based applications. For additional information on how to choose the right approach you can refer to https://aws.amazon.com/blogs/compute/comparing-design-approaches-for-building-serverless-microservices/[Comparing design approaches for serverless microservices].
+
+After the decision has been made you can benefit from the following routing mechanisms.
+
+[[aws-function-routing-multi]]
+=== Routing for multiple Lambda functions
+
+If you have decided to deploy your single Spring Cloud Function project (JAR) to multiple Lambda functions you need to provide a hint on which specific method to call, for example `uppercase` or `lowercase`. You can use https://docs.aws.amazon.com/lambda/latest/dg/configuration-envvars.html[AWS Lambda environment variables] to provide the routing instructions.
+
+Note that AWS does not allow dots `.` and/or hyphens `-` in the name of the environment variable. You can benefit from Spring Boot support and simply substitute dots with underscores and hyphens with camel case. So for example `spring.cloud.function.definition` becomes `spring_cloud_function_definition` and `spring.cloud.function.routing-expression` becomes `spring_cloud_function_routingExpression`.
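The substitution rules above can be sketched as a small, hypothetical helper. This is not part of Spring Boot's API; Boot's relaxed binding performs the equivalent mapping internally:

```java
class EnvVarNameSketch {
    // Dots become underscores; a hyphen drops out and upper-cases the
    // following character (kebab-case to camel case).
    static String toEnvVarName(String property) {
        StringBuilder result = new StringBuilder();
        boolean upperNext = false;
        for (char c : property.toCharArray()) {
            if (c == '.') {
                result.append('_');
            }
            else if (c == '-') {
                upperNext = true;
            }
            else {
                result.append(upperNext ? Character.toUpperCase(c) : c);
                upperNext = false;
            }
        }
        return result.toString();
    }

    public static void main(String[] args) {
        System.out.println(toEnvVarName("spring.cloud.function.definition"));
        System.out.println(toEnvVarName("spring.cloud.function.routing-expression"));
    }
}
```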
+
+Therefore, a configuration for a single Spring Cloud project with two methods deployed to separate AWS Lambda functions can look like this:
+
+[source, java]
+----
+@SpringBootApplication
+public class FunctionConfiguration {
+
+ public static void main(String[] args) {
+ SpringApplication.run(FunctionConfiguration.class, args);
+ }
+
+ @Bean
+ public Function<String, String> uppercase() {
+ return value -> value.toUpperCase();
+ }
+
+ @Bean
+ public Function<String, String> lowercase() {
+ return value -> value.toLowerCase();
+ }
+}
+----
+
+[source, yaml]
+----
+AWSTemplateFormatVersion: '2010-09-09'
+Transform: AWS::Serverless-2016-10-31
+
+Resources:
+ MyUpperCaseLambda:
+ Type: AWS::Serverless::Function
+ Properties:
+ Handler: org.springframework.cloud.function.adapter.aws.FunctionInvoker
+ Runtime: java21
+ MemorySize: 512
+ CodeUri: target/function-sample-aws-0.0.1-SNAPSHOT-aws.jar
+ Environment:
+ Variables:
+ spring_cloud_function_definition: uppercase
+
+ MyLowerCaseLambda:
+ Type: AWS::Serverless::Function
+ Properties:
+ Handler: org.springframework.cloud.function.adapter.aws.FunctionInvoker
+ Runtime: java21
+ MemorySize: 512
+ CodeUri: target/function-sample-aws-0.0.1-SNAPSHOT-aws.jar
+ Environment:
+ Variables:
+ spring_cloud_function_definition: lowercase
+
+----
+
+You may ask why we do not use the Lambda function handler to point the entry method directly at `uppercase` or `lowercase`. In a Spring Cloud Function project it is recommended to use the built-in `FunctionInvoker` as outlined in <>. Therefore, we provide the routing definition via environment variables instead.
+
+
+[[aws-function-routing-single]]
+=== Routing within a single Lambda function
+
+If you have decided to deploy your Spring Cloud Function project with multiple methods (such as `uppercase` and `lowercase`) to a single Lambda function, you need a more dynamic routing approach. Since `application.properties` and environment variables are defined at build or deployment time, you can't use them in a single-function scenario. In this case you can leverage `MessagingRoutingCallback` or `Message Headers` as outlined in the https://docs.spring.io/spring-cloud-function/docs/{project-version}/reference/html/spring-cloud-function.html#_function_routing_and_filtering[Spring Cloud Function Routing section].
+
+More details are available in the provided https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-aws-routing[sample].
+
+[[performance]]
+== Performance considerations
+
+A core characteristic of serverless functions is the ability to scale to zero and handle sudden traffic spikes. To handle requests, AWS Lambda spins up https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtime-environment.html[new execution environments]. These environments need to be initialized, your code needs to be downloaded, and a JVM as well as your application need to start. This is known as a cold start. To reduce cold-start time you can rely on the following mechanisms:
+
+1. Leverage AWS Lambda SnapStart to start your Lambda function from pre-initialized snapshots.
+2. Tune the Memory Configuration via AWS Lambda Power Tuning to find the best tradeoff between performance and cost.
+3. Follow AWS SDK Best Practices such as defining SDK clients outside the handler code or leverage more advanced priming techniques.
+4. Implement additional Spring mechanisms to reduce Spring startup and initialization time such as https://github.com/spring-cloud/spring-cloud-function/blob/main/spring-cloud-function-samples/function-functional-sample-aws/src/main/java/example/FunctionConfiguration.java[functional bean registration].
+
+Please refer to https://aws.amazon.com/blogs/compute/reducing-java-cold-starts-on-aws-lambda-functions-with-snapstart/[the official guidance] for more information.
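The third point, initializing SDK clients outside the handler code, can be sketched in plain Java. `ExpensiveClient` below is a hypothetical stand-in for an AWS SDK client that is costly to construct:

```java
class HandlerInitSketch {

    // Hypothetical stand-in for an SDK client that is expensive to build.
    static class ExpensiveClient {
        static int constructions = 0;
        ExpensiveClient() { constructions++; }
        String fetch(String key) { return "value-of-" + key; }
    }

    // Initialized once per execution environment (during the cold start),
    // then reused across all warm invocations.
    private static final ExpensiveClient CLIENT = new ExpensiveClient();

    static String handle(String key) {
        // The anti-pattern would be `new ExpensiveClient()` here,
        // paying the construction cost on every single invocation.
        return CLIENT.fetch(key);
    }

    public static void main(String[] args) {
        handle("a");
        handle("b");
        System.out.println(ExpensiveClient.constructions); // 1
    }
}
```

The same idea applies to any heavyweight resource (connection pools, caches, configuration), which is also why SnapStart snapshots taken after initialization are so effective.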
+
+[[graalvm]]
+== GraalVM Native Image
+
+Spring Cloud Function provides GraalVM Native Image support for functions running on AWS Lambda. Since GraalVM native images do not run on a traditional Java Virtual Machine (JVM), you must deploy your native Spring Cloud Function to an AWS Lambda custom runtime. The most notable difference is that you no longer provide a JAR file, but instead a ZIP package that bundles the native executable with a `bootstrap` file containing the start instructions:
+
+[source, text]
+----
+lambda-custom-runtime.zip
+ |-- bootstrap
+ |-- function-sample-aws-native
+----
-One of the core features of Spring Cloud Function is https://docs.spring.io/spring-cloud-function/docs/{project-version}/reference/html/spring-cloud-function.html#_function_routing_and_filtering[routing]
-- an ability to have one special function to delegate to other functions based on the user provided routing instructions.
+Bootstrap file:
-In AWS Lambda environment this feature provides one additional benefit, as it allows you to bind a single function (Routing Function)
-as AWS Lambda and thus a single HTTP endpoint for API Gateway. So in the end you only manage one function and one endpoint, while benefiting
-from many function that can be part of your application.
+[source, text]
+----
+#!/bin/sh
-More details are available in the provided https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-aws-routing[sample],
-yet few general things worth mentioning.
+cd ${LAMBDA_TASK_ROOT:-.}
-Routing capabilities will be enabled by default whenever there is more then one function in your application as `org.springframework.cloud.function.adapter.aws.FunctionInvoker`
-can not determine which function to bind as AWS Lambda, so it defaults to `RoutingFunction`.
-This means that all you need to do is provide routing instructions which you can do https://docs.spring.io/spring-cloud-function/docs/{project-version}/reference/html/spring-cloud-function.html#_function_routing_and_filtering[using several mechanisms]
-(see https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-aws-routing[sample] for more details).
+./function-sample-aws-native
+----
-Also, note that since AWS does not allow dots `.` and/or hyphens`-` in the name of the environment variable, you can benefit from boot support and simply substitute
-dots with underscores and hyphens with camel case. So for example `spring.cloud.function.definition` becomes `spring_cloud_function_definition`
-and `spring.cloud.function.routing-expression` becomes `spring_cloud_function_routingExpression`.
+You can find https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-aws-native[a full GraalVM native-image example with Spring Cloud Function on GitHub]. For a deep dive you can also refer to the https://catalog.workshops.aws/java-on-aws-lambda/en-US/02-accelerate/graal-plain-java[GraalVM modules of the Java on AWS Lambda workshop].
[[custom-runtime]]
-=== Custom Runtime
+== Custom Runtime
+
+Lambda focuses on providing stable long-term support (LTS) Java runtime versions. The official Lambda runtimes are built around a combination of operating system, programming language, and software libraries that are subject to maintenance and security updates. For example, the Lambda runtime for Java supports the LTS versions such as Java 17 Corretto and Java 21 Corretto. You can find the full list https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtimes.html[here]. There is no provided runtime for non-LTS versions like Java 22, Java 23 or Java 24.
-You can also benefit from https://docs.aws.amazon.com/lambda/latest/dg/runtimes-custom.html[AWS Lambda custom runtime] feature of AWS Lambda
-and Spring Cloud Function provides all the necessary components to make it easy.
+To use other language versions, JVMs or GraalVM native images, Lambda allows you to https://docs.aws.amazon.com/lambda/latest/dg/runtimes-custom.html[create custom runtimes]. Custom runtimes allow you to provide and configure your own runtime for running your application code. Spring Cloud Function provides all the necessary components to make this easy.
-From the code perspective the application should look no different then any other Spring Cloud Function application.
-The only thing you need to do is to provide a `bootstrap` script in the root of your zip/jar that runs the Spring Boot application.
+From the code perspective the application should not look different from any other Spring Cloud Function application.
+The only thing you need to do is to provide a `bootstrap` script in the root of your ZIP/JAR that runs the Spring Boot application
and select "Custom Runtime" when creating a function in AWS.
Here is an example 'bootstrap' file:
```text
@@ -133,28 +285,29 @@ java -Dspring.main.web-application-type=none -Dspring.jmx.enabled=false \
```
The `com.example.LambdaApplication` represents your application which contains function beans.
-Set the handler name in AWS to the name of your function. You can use function composition here as well (e.g., `uppecrase|reverse`).
-That is pretty much all. Once you upload your zip/jar to AWS your function will run in custom runtime.
-We provide a https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples/function-sample-aws-custom-new[sample project]
-where you can also see how to configure yoru POM to properly generate the zip file.
+Set the handler name in AWS to the name of your function. You can use function composition here as well (e.g., `uppercase|reverse`).
+Once you upload your ZIP/JAR to AWS your function will run in a custom runtime.
+We provide a https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples/function-sample-aws-custom-new[sample project]
+where you can also see how to configure your POM to properly generate the ZIP file.
-The functional bean definition style works for custom runtimes as well, and is
-faster than the `@Bean` style. A custom runtime can start up much quicker even than a functional bean implementation
-of a Java lambda - it depends mostly on the number of classes you need to load at runtime.
-Spring doesn't do very much here, so you can reduce the cold start time by only using primitive types in your function, for instance,
+The functional bean definition style works for custom runtimes as well, and is
+faster than the `@Bean` style. A custom runtime can start up much quicker even than a functional bean implementation
+of a Java lambda - it depends mostly on the number of classes you need to load at runtime.
+Spring doesn't do very much here, so you can reduce the cold start time by only using primitive types in your function, for instance,
and not doing any work in custom `@PostConstruct` initializers.
[[aws-function-routing-with-custom-runtime]]
=== AWS Function Routing with Custom Runtime
-When using <> Function Routing works the same way. All you need is to specify `functionRouter` as AWS Handler the same way you would use the name of the function as handler.
+When using a <<custom-runtime,custom runtime>>, function routing works the same way. All you need to do is specify `functionRouter` as the AWS handler, the same way you would use the name of the function as the handler.
-=== Deploying Container images
+== Deploying Lambda functions as container images
-Custom Runtime is also responsible for handling of container image deployments.
-When deploying container images in a way similar to the one described https://github.com/spring-cloud/spring-cloud-function/issues/1021[here], it is important
+In contrast to JAR or ZIP based deployments you can also deploy your Lambda functions as a container image via an image registry. For additional details please refer to the https://docs.aws.amazon.com/lambda/latest/dg/images-create.html[official AWS Lambda documentation].
+
+When deploying container images in a way similar to the one described https://github.com/spring-cloud/spring-cloud-function/issues/1021[here], it is important
to remember to set an environment variable `DEFAULT_HANDLER` with the name of the function.
For example, for the function bean shown below the `DEFAULT_HANDLER` value would be `readMessageFromSQS`.
@@ -166,7 +319,7 @@ public Consumer> readMessageFromSQS() {
}
----
-Also, it is important to remember to ensure tht `spring_cloud_function_web_export_enabled` is also set to `false`. It is by default.
+Also, it is important to remember to ensure that `spring_cloud_function_web_export_enabled` is also set to `false`. It is `true` by default.
[[notes-on-jar-layout]]
== Notes on JAR Layout
@@ -195,7 +348,7 @@ then additional transformers must be configured as part of the maven-shade-plugi
 <groupId>org.springframework.boot</groupId>
 <artifactId>spring-boot-maven-plugin</artifactId>
- <version>2.7.4</version>
+ <version>3.4.2</version>
@@ -237,7 +390,7 @@ then additional transformers must be configured as part of the maven-shade-plugi
== Build file setup
In order to run Spring Cloud Function applications on AWS Lambda, you can leverage Maven or Gradle
- plugins offered by the cloud platform provider.
+plugins.
[[maven]]
@@ -304,7 +457,7 @@ Below is a complete gradle file
----
plugins {
id 'java'
- id 'org.springframework.boot' version '3.2.0-M2'
+ id 'org.springframework.boot' version '3.4.2'
id 'io.spring.dependency-management' version '1.1.3'
id 'com.github.johnrengelman.shadow' version '8.1.1'
id 'maven-publish'
@@ -325,7 +478,7 @@ repositories {
}
ext {
- set('springCloudVersion', "2023.0.0-M1")
+ set('springCloudVersion', "2024.0.0")
}
assemble.dependsOn = [thinJar, shadowJar]
@@ -389,3 +542,60 @@ tasks.named('test') {
You can find the entire sample `build.gradle` file for deploying Spring Cloud Function
applications to AWS Lambda with Gradle https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-aws/build.gradle[here].
+
+[[serverless-java-container]]
+== Serverless Java container for Spring Boot Web
+
+You can use the https://github.com/aws/serverless-java-container[aws-serverless-java-container] library to run a Spring Boot 3 applications in AWS Lambda. This is a good fit for migrations of existing Spring applications to AWS Lambda or if you build sophisticated APIs with multiple API endpoints and want to maintain the familiar `RestController` approach. The following section provides a high-level overview of the process. Please refer to the https://github.com/aws/serverless-java-container/wiki/Quick-start---Spring-Boot3[official sample code for additional information].
+
+1. Import the Serverless Java Container library to your existing Spring Boot 3 web app
++
+[source, xml]
+----
+<dependency>
+    <groupId>com.amazonaws.serverless</groupId>
+    <artifactId>aws-serverless-java-container-springboot3</artifactId>
+    <version>2.1.2</version>
+</dependency>
+----
+
+2. Use the built-in Lambda function handler that serves as an entrypoint
++
+`com.amazonaws.serverless.proxy.spring.SpringDelegatingLambdaContainerHandler`
+
+3. Configure an environment variable named `MAIN_CLASS` to let the generic handler know where to find your original application main class. Usually that is the class annotated with `@SpringBootApplication`.
+
+`MAIN_CLASS = com.my.package.MySpringBootApplication`
+
+Below you can see an example deployment configuration:
+
+[source, yaml]
+----
+AWSTemplateFormatVersion: '2010-09-09'
+Transform: AWS::Serverless-2016-10-31
+
+Resources:
+ MySpringBootLambdaFunction:
+ Type: AWS::Serverless::Function
+ Properties:
+ Handler: com.amazonaws.serverless.proxy.spring.SpringDelegatingLambdaContainerHandler
+ Runtime: java21
+ MemorySize: 1024
+ CodeUri: target/lambda-spring-boot-app-0.0.1-SNAPSHOT.jar #Must be a shaded Jar
+ Environment:
+ Variables:
+ MAIN_CLASS: com.amazonaws.serverless.sample.springboot3.Application #Class annotated with @SpringBootApplication
+
+----
+
+Please find all the examples including GraalVM native-image https://github.com/aws/serverless-java-container/tree/main/samples/springboot3[here].
+
+
+[[resources]]
+== Additional resources
+
+- https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples[Official Example Repositories on GitHub]
+- https://catalog.workshops.aws/java-on-aws-lambda/en-US/01-migration/architecture-overview[Java on AWS Lambda workshop with dedicated Spring examples]
+- https://catalog.workshops.aws/java-on-aws/en-US[Java on AWS Immersion Day]
+- https://serverlessland.com/content/service/lambda/paved-path/java-replatforming/introduction[Java Replatforming Guide]
+- https://www.youtube.com/watch?v=AFIHug_HujI[Talk: Spring I/O 2024 - Serverless Java with Spring]
diff --git a/docs/modules/ROOT/pages/adapters/aws.adoc b/docs/modules/ROOT/pages/adapters/aws.adoc
deleted file mode 100644
index fe35449d0..000000000
--- a/docs/modules/ROOT/pages/adapters/aws.adoc
+++ /dev/null
@@ -1,155 +0,0 @@
-*{project-version}*
-
-
-The https://aws.amazon.com/[AWS] adapter takes a Spring Cloud Function app and converts it to a form that can run in AWS Lambda.
-
-[[introduction]]
-== Introduction
-
-The details of how to get stared with AWS Lambda is out of scope of this document, so the expectation is that user has some familiarity with
-AWS and AWS Lambda and wants to learn what additional value spring provides.
-
-
-=== Getting Started
-
-One of the goals of Spring Cloud Function framework is to provide necessary infrastructure elements to enable a _simple function application_
-to interact in a certain way in a particular environment.
-A simple function application (in context or Spring) is an application that contains beans of type Supplier, Function or Consumer.
-So, with AWS it means that a simple function bean should somehow be recognised and executed in AWS Lambda environment.
-
-Let’s look at the example:
-
-[source, java]
-----
-@SpringBootApplication
-public class FunctionConfiguration {
-
- public static void main(String[] args) {
- SpringApplication.run(FunctionConfiguration.class, args);
- }
-
- @Bean
- public Function uppercase() {
- return value -> value.toUpperCase();
- }
-}
-----
-
-It shows a complete Spring Boot application with a function bean defined in it. What’s interesting is that on the surface this is just
-another boot app, but in the context of AWS Adapter it is also a perfectly valid AWS Lambda application. No other code or configuration
-is required. All you need to do is package it and deploy it, so let’s look how we can do that.
-
-To make things simpler we’ve provided a sample project ready to be built and deployed and you can access it
-https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples/function-sample-aws[here].
-
-You simply execute `./mvnw clean package` to generate JAR file. All the necessary maven plugins have already been setup to generate
-appropriate AWS deployable JAR file. (You can read more details about JAR layout in <>).
-
-Then you have to upload the JAR file (via AWS dashboard or AWS CLI) to AWS.
-
-When ask about _handler_ you specify `org.springframework.cloud.function.adapter.aws.FunctionInvoker::handleRequest` which is a generic request handler.
-
-image::AWS-deploy.png[width=800,scaledwidth="75%",align="center"]
-
-That is all. Save and execute the function with some sample data which for this function is expected to be a
-String which function will uppercase and return back.
-
-While `org.springframework.cloud.function.adapter.aws.FunctionInvoker` is a general purpose AWS's `RequestHandler` implementation aimed at completely
-isolating you from the specifics of AWS Lambda API, for some cases you may want to specify which specific AWS's `RequestHandler` you want
-to use. The next section will explain you how you can accomplish just that.
-
-
-[[aws-request-handlers]]
-== AWS Request Handlers
-
-The adapter has a couple of generic request handlers that you can use. The most generic is (and the one we used in the Getting Started section)
-is `org.springframework.cloud.function.adapter.aws.FunctionInvoker` which is the implementation of AWS's `RequestStreamHandler`.
-User doesn't need to do anything other then specify it as 'handler' on AWS dashboard when deploying function.
-It will handle most of the case including Kinesis, streaming etc. .
-
-
-If your app has more than one `@Bean` of type `Function` etc. then you can choose the one to use by configuring `spring.cloud.function.definition`
-property or environment variable. The functions are extracted from the Spring Cloud `FunctionCatalog`. In the event you don't specify `spring.cloud.function.definition`
-the framework will attempt to find a default following the search order where it searches first for `Function` then `Consumer` and finally `Supplier`).
-
-[[aws-context]]
-== AWS Context
-
-In a typical implementation of AWS Handler user has access to AWS _context_ object. With function approach you can have the same experience if you need it.
-Upon each invocation the framework will add `aws-context` message header containing the AWS _context_ instance for that particular invocation. So if you need to access it
-you can simply have `Message` as an input parameter to your function and then access `aws-context` from message headers.
-For convenience we provide AWSLambdaUtils.AWS_CONTEXT constant.
-
-
-[[aws-function-routing]]
-== AWS Function Routing
-
-One of the core features of Spring Cloud Function is https://docs.spring.io/spring-cloud-function/docs/{project-version}/reference/html/spring-cloud-function.html#_function_routing_and_filtering[routing]
-- an ability to have one special function to delegate to other functions based on the user provided routing instructions.
-
-In AWS Lambda environment this feature provides one additional benefit, as it allows you to bind a single function (Routing Function)
-as AWS Lambda and thus a single HTTP endpoint for API Gateway. So in the end you only manage one function and one endpoint, while benefiting
-from many function that can be part of your application.
-
-More details are available in the provided https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-aws-routing[sample],
-yet few general things worth mentioning.
-
-Routing capabilities will be enabled by default whenever there is more then one function in your application as `org.springframework.cloud.function.adapter.aws.FunctionInvoker`
-can not determine which function to bind as AWS Lambda, so it defaults to `RoutingFunction`.
-This means that all you need to do is provide routing instructions which you can do https://docs.spring.io/spring-cloud-function/docs/{project-version}/reference/html/spring-cloud-function.html#_function_routing_and_filtering[using several mechanisms]
-(see https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-aws-routing[sample] for more details).
-
-Also, note that since AWS does not allow dots `.` and/or hyphens`-` in the name of the environment variable, you can benefit from boot support and simply substitute
-dots with underscores and hyphens with camel case. So for example `spring.cloud.function.definition` becomes `spring_cloud_function_definition`
-and `spring.cloud.function.routing-expression` becomes `spring_cloud_function_routingExpression`.
-
-
-[[http-and-api-gateway]]
-== HTTP and API Gateway
-
-AWS has some platform-specific data types, including batching of messages, which is much more efficient than processing each one individually. To make use of these types you can write a function that depends on those types. Or you can rely on Spring to extract the data from the AWS types and convert it to a Spring `Message`. To do this you tell AWS that the function is of a specific generic handler type (depending on the AWS service) and provide a bean of type `Function,Message>`, where `S` and `T` are your business data types. If there is more than one bean of type `Function` you may also need to configure the Spring Boot property `function.name` to be the name of the target bean (e.g. use `FUNCTION_NAME` as an environment variable).
-
-The supported AWS services and generic handler types are listed below:
-
-|===
-| Service | AWS Types | Generic Handler |
-
-| API Gateway | `APIGatewayProxyRequestEvent`, `APIGatewayProxyResponseEvent` | `org.springframework.cloud.function.adapter.aws.SpringBootApiGatewayRequestHandler` |
-| Kinesis | KinesisEvent | org.springframework.cloud.function.adapter.aws.SpringBootKinesisEventHandler |
-|===
-
-
-For example, to deploy behind an API Gateway, use `--handler org.springframework.cloud.function.adapter.aws.SpringBootApiGatewayRequestHandler` in your AWS command line (in via the UI) and define a `@Bean` of type `Function,Message>` where `Foo` and `Bar` are POJO types (the data will be marshalled and unmarshalled by AWS using Jackson).
-
-[[custom-runtime]]
-== Custom Runtime
-
-You can also benefit from https://docs.aws.amazon.com/lambda/latest/dg/runtimes-custom.html[AWS Lambda custom runtime] feature of AWS Lambda
-and Spring Cloud Function provides all the necessary components to make it easy.
-
-From the code perspective the application should look no different then any other Spring Cloud Function application.
-The only thing you need to do is to provide a `bootstrap` script in the root of your zip/jar that runs the Spring Boot application.
-and select "Custom Runtime" when creating a function in AWS.
-Here is an example 'bootstrap' file:
-```text
-#!/bin/sh
-
-cd ${LAMBDA_TASK_ROOT:-.}
-
-java -Dspring.main.web-application-type=none -Dspring.jmx.enabled=false \
- -noverify -XX:TieredStopAtLevel=1 -Xss256K -XX:MaxMetaspaceSize=128M \
- -Djava.security.egd=file:/dev/./urandom \
- -cp .:`echo lib/*.jar | tr ' ' :` com.example.LambdaApplication
-```
-The `com.example.LambdaApplication` represents your application which contains function beans.
-
-Set the handler name in AWS to the name of your function. You can use function composition here as well (e.g., `uppecrase|reverse`).
-That is pretty much all. Once you upload your zip/jar to AWS your function will run in custom runtime.
-We provide a https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples/function-sample-aws-custom-new[sample project]
-where you can also see how to configure yoru POM to properly generate the zip file.
-
-The functional bean definition style works for custom runtimes as well, and is
-faster than the `@Bean` style. A custom runtime can start up much quicker even than a functional bean implementation
-of a Java lambda - it depends mostly on the number of classes you need to load at runtime.
-Spring doesn't do very much here, so you can reduce the cold start time by only using primitive types in your function, for instance,
-and not doing any work in custom `@PostConstruct` initializers.
diff --git a/docs/modules/ROOT/pages/adapters/azure-intro.adoc b/docs/modules/ROOT/pages/adapters/azure-intro.adoc
index c8d0d8610..fa3ac447e 100644
--- a/docs/modules/ROOT/pages/adapters/azure-intro.adoc
+++ b/docs/modules/ROOT/pages/adapters/azure-intro.adoc
@@ -1,6 +1,6 @@
[[microsoft-azure-functions]]
= Microsoft Azure Functions
-
+:page-aliases: adapters/azure.adoc
https://azure.microsoft.com[Azure] function adapter for deploying `Spring Cloud Function` applications as native Azure Java Functions.
@@ -198,7 +198,7 @@ Usually the Azure Maven (or Gradle) plugins are used to generate the necessary c
IMPORTANT: The Azure https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-java?tabs=bash%2Cconsumption#folder-structure[packaging format] is not compatible with the default Spring Boot packaging (e.g. `uber jar`).
The xref:adapters/azure-intro.adoc#disable.spring.boot.plugin[Disable Spring Boot Plugin] section below explains how to handle this.
-[[azure-maven/gradle-plugins]]
+[[azure-maven-gradle-plugins]]
=== Azure Maven/Gradle Plugins
Azure provides https://github.com/microsoft/azure-maven-plugins/tree/develop/azure-functions-maven-plugin[Maven] and https://github.com/microsoft/azure-gradle-plugins/tree/master/azure-functions-gradle-plugin[Gradle] plugins to process the annotated classes, generate the necessary configurations and produce the expected package layout.
diff --git a/docs/modules/ROOT/pages/adapters/azure.adoc b/docs/modules/ROOT/pages/adapters/azure.adoc
deleted file mode 100644
index 490093572..000000000
--- a/docs/modules/ROOT/pages/adapters/azure.adoc
+++ /dev/null
@@ -1,2 +0,0 @@
-*{project-version}*
-
diff --git a/docs/modules/ROOT/pages/adapters/gcp-intro.adoc b/docs/modules/ROOT/pages/adapters/gcp-intro.adoc
index 04480c856..02a92c3d7 100644
--- a/docs/modules/ROOT/pages/adapters/gcp-intro.adoc
+++ b/docs/modules/ROOT/pages/adapters/gcp-intro.adoc
@@ -1,5 +1,6 @@
[[google-cloud-functions]]
= Google Cloud Functions
+:page-aliases: adapters/gcp.adoc
The Google Cloud Functions adapter enables Spring Cloud Function apps to run on the https://cloud.google.com/functions[Google Cloud Functions] serverless platform.
You can either run the function locally using the open source https://github.com/GoogleCloudPlatform/functions-framework-java[Google Functions Framework for Java] or on GCP.
@@ -109,7 +110,7 @@ curl http://localhost:8080/ -d "hello"
----
-== Buikd & Deploy to GCP
+== Build & Deploy to GCP
Start by packaging your application.
diff --git a/docs/modules/ROOT/pages/adapters/gcp.adoc b/docs/modules/ROOT/pages/adapters/gcp.adoc
deleted file mode 100644
index 490093572..000000000
--- a/docs/modules/ROOT/pages/adapters/gcp.adoc
+++ /dev/null
@@ -1,2 +0,0 @@
-*{project-version}*
-
diff --git a/docs/modules/ROOT/pages/spring-cloud-function/deploying-a-packaged.adoc b/docs/modules/ROOT/pages/spring-cloud-function/deploying-a-packaged.adoc
index a3b6ce4a1..932d502b7 100644
--- a/docs/modules/ROOT/pages/spring-cloud-function/deploying-a-packaged.adoc
+++ b/docs/modules/ROOT/pages/spring-cloud-function/deploying-a-packaged.adoc
@@ -19,7 +19,7 @@ the functions. It can optionally use a `maven:` prefix to locate the artifact vi
for complete details). A Spring Boot application is bootstrapped from the jar file, using the `MANIFEST.MF` to locate a start class, so
that a standard Spring Boot fat jar works well, for example. If the target jar can be launched successfully then the result is a function
registered in the main application's `FunctionCatalog`. The registered function can be applied by code in the main application, even though
-it was created in an isolated class loader (by deault).
+it was created in an isolated class loader (by default).
Here is the example of deploying a JAR which contains an 'uppercase' function and invoking it .
diff --git a/docs/modules/ROOT/pages/spring-cloud-function/programming-model.adoc b/docs/modules/ROOT/pages/spring-cloud-function/programming-model.adoc
index 9b5df6350..f371dc11a 100644
--- a/docs/modules/ROOT/pages/spring-cloud-function/programming-model.adoc
+++ b/docs/modules/ROOT/pages/spring-cloud-function/programming-model.adoc
@@ -2,67 +2,71 @@
= Programming model
[[function.catalog]]
-
[[function-catalog-and-flexible-function-signatures]]
== Function Catalog and Flexible Function Signatures
-One of the main features of Spring Cloud Function is to adapt and support a range of type signatures for user-defined functions,
-while providing a consistent execution model.
-That's why all user defined functions are transformed into a canonical representation by `FunctionCatalog`.
+One of the main features of Spring Cloud Function is to adapt and support a range of type signatures for user-defined functions, while providing a consistent execution model.
+That's why all user-defined functions are transformed into a canonical representation by `FunctionCatalog`.
-While users don't normally have to care about the `FunctionCatalog` at all, it is useful to know what
-kind of functions are supported in user code.
+While users don't normally have to care about the `FunctionCatalog` at all, it is useful to know what kind of functions are supported in user code.
-It is also important to understand that Spring Cloud Function provides first class support for reactive API
-provided by https://projectreactor.io/[Project Reactor] allowing reactive primitives such as `Mono` and `Flux`
-to be used as types in user defined functions providing greater flexibility when choosing programming model for
-your function implementation.
-Reactive programming model also enables functional support for features that would be otherwise difficult to impossible to implement
-using imperative programming style. For more on this please read <> section.
+It is also important to understand that Spring Cloud Function provides first-class support for reactive APIs, provided by https://projectreactor.io/[Project Reactor].
+This allows reactive primitives such as `Mono` and `Flux` to be used as types in user-defined functions thereby providing greater flexibility when choosing a programming model for your function implementation.
+A reactive programming model also enables functional support for features that would be otherwise difficult or impossible to implement using an imperative programming style.
+For more on this, please read the section on <>.
[[java-8-function-support]]
== Java 8 function support
-Spring Cloud Function embraces and builds on top of the 3 core functional interfaces defined by Java
-and available to us since Java 8.
+Spring Cloud Function embraces and builds on top of the three core functional interfaces defined by Java and available since Java 8:
- Supplier
- Function
- Consumer
-To avoid constantly mentioning `Supplier`, `Function` and `Consumer` we’ll refer to them a Functional beans for the rest of this manual where appropriate.
+To avoid constantly mentioning `Supplier`, `Function` and `Consumer`, we’ll refer to them as Functional beans where appropriate for the rest of this manual.
-In a nutshell, any bean in your Application Context that is Functional bean will lazily be registered with `FunctionCatalog`.
+In a nutshell, any bean in your `ApplicationContext` that is a Functional bean will be lazily registered with `FunctionCatalog`.
This means that it could benefit from all of the additional features described in this reference manual.
-In a simplest of application all you need to do is to declare `@Bean` of type `Supplier`, `Function` or `Consumer` in your application configuration.
-Then you can access `FunctionCatalog` and lookup a particular function based on its name.
+In the simplest application, all you need to do is to declare a `@Bean` of type `Supplier`, `Function` or `Consumer` in your application configuration.
+Then, you can use `FunctionCatalog` to look up a particular function by its name.
For example:
-
-[source, test]
+[source, java]
----
@Bean
public Function uppercase() {
return value -> value.toUpperCase();
}
-. . .
+// . . .
FunctionCatalog catalog = applicationContext.getBean(FunctionCatalog.class);
Function uppercase = catalog.lookup(“uppercase”);
----
-Important to understand that given that `uppercase` is a bean, you can certainly get it form the `ApplicationContext` directly, but all you will get is just your bean as you declared it without any extra features provided by SCF. When you do lookup of a function via `FunctionCatalog`, the instance you will receive is wrapped (instrumented) with additional features (i.e., type conversion, composition etc.) described in this manual. Also, it is important to understand that a typical user does not use Spring Cloud Function directly. Instead a typical user implements Java `Function/Supplier/Consumer` with the idea of using it in different execution contexts without additional work. For example the same java function could be represented as _REST endpoint_ or _Streaming message handler_ or _AWS Lambda_ and more via Spring Cloud Function provided
-adapters as well as other frameworks using Spring Cloud Function as the core programming model (e.g., https://spring.io/projects/spring-cloud-stream[Spring Cloud Stream])
-So in summary Spring Cloud Function instruments java functions with additional features to be utilised in variety of execution contexts.
+It is important to understand that given `uppercase` is a bean, you can certainly get it from the `ApplicationContext` directly, but all you will get is just your bean as you declared it, without any extra features provided by SCF.
+When you look up a function via `FunctionCatalog`, the instance you receive is wrapped (instrumented) with additional features (i.e., type conversion, composition, etc.) described in this manual.
+
+Also, it is important to understand that a typical user does not use Spring Cloud Function directly.
+Instead, a typical user implements a Java `Function`, `Supplier`, or `Consumer` with the idea of using it in different execution contexts without additional work.
+For example, the same Java function could be represented as a _REST endpoint_, a _Streaming message handler_, or an _AWS Lambda_, and more, via adapters provided by Spring Cloud Function, as well as by other frameworks that use Spring Cloud Function as the core programming model (e.g. https://spring.io/projects/spring-cloud-stream[Spring Cloud Stream]).
+
+In summary, Spring Cloud Function instruments Java functions with additional features to be utilized in a variety of execution contexts.
[[function-definition]]
=== Function definition
-While the previous example shows you how to lookup function in FunctionCatalog programmatically, in a typical integration case where Spring Cloud Function used as programming model by another framework (e.fg. Spring Cloud Stream), you declare which functions to use via `spring.cloud.function.definition` property. Knowing that it is important to understand some default behaviour when it comes to discovering functions in `FunctionCatalog`. For example, if you only have one Functional bean in your `ApplicationContext`, the `spring.cloud.function.definition` property typically will not be required, since a single function in `FunctionCatalog` can be looked up by an empty name or any name. For example, assuming that `uppercase` is the only function in your catalog, it can be looked up as `catalog.lookup(null)`, `catalog.lookup(“”)`, `catalog.lookup(“foo”)`
-That said, for cases where you are using framework such as Spring Cloud Stream which uses `spring.cloud.function.definition` it is best practice and recommended to always use `spring.cloud.function.definition` property.
+
+While the previous example shows you how to look up a function in `FunctionCatalog` programmatically, in a typical integration case where Spring Cloud Function is used as the programming model by another framework (e.g. https://spring.io/projects/spring-cloud-stream[Spring Cloud Stream]), you can declare which functions to use via the `spring.cloud.function.definition` property.
+It is important to know and understand the default behaviour when it comes to discovering functions in `FunctionCatalog`.
+
+For instance, if you only have one Functional bean in your `ApplicationContext`, the `spring.cloud.function.definition` property typically will not be required, since a single function in `FunctionCatalog` can be looked up by an empty name or any name.
+For example, assuming that `uppercase` is the only function in your catalog, it can be looked up as `catalog.lookup(null)`, `catalog.lookup("")`, or `catalog.lookup("foo")`.
+
+That said, for cases where you are using a framework such as Spring Cloud Stream, which uses `spring.cloud.function.definition`, it is recommended to always use the `spring.cloud.function.definition` property.
For example,
@@ -73,11 +77,12 @@ spring.cloud.function.definition=uppercase
[[filtering-ineligible-functions]]
=== Filtering ineligible functions
-A typical Application Context may include beans that are valid java functions, but not intended to be candidates to be registered with `FunctionCatalog`.
-Such beans could be auto-configurations from other projects or any other beans that qualify to be Java functions.
-The framework provides default filtering of known beans that should not be candidates for registration with function catalog.
-You can also add to this list additional beans by providing coma delimited list of bean definition names using
-`spring.cloud.function.ineligible-definitions` property.
+
+A typical `ApplicationContext` may include beans that are valid Java functions, but not intended as candidates to be registered with `FunctionCatalog`.
+Such beans could be auto-configurations from other projects or any other bean that qualifies as a Java function.
+
+The framework provides default filtering of known beans that should not be candidates for registration with `FunctionCatalog`.
+You can also add additional beans to this list by providing a comma-delimited list of bean definition names using the `spring.cloud.function.ineligible-definitions` property.
For example,
@@ -88,24 +93,22 @@ spring.cloud.function.ineligible-definitions=foo,bar
[[supplier]]
=== Supplier
-Supplier can be _reactive_ - `Supplier>`
-or _imperative_ - `Supplier`. From the invocation standpoint this should make no difference
-to the implementor of such Supplier. However, when used within frameworks
-(e.g., https://spring.io/projects/spring-cloud-stream[Spring Cloud Stream]), Suppliers, especially reactive,
-often used to represent the source of the stream, therefore they are invoked once to get the stream (e.g., Flux)
-to which consumers can subscribe to. In other words such suppliers represent an equivalent of an _infinite stream_.
-However, the same reactive suppliers can also represent _finite_ stream(s) (e.g., result set on the polled JDBC data).
-In those cases such reactive suppliers must be hooked up to some polling mechanism of the underlying framework.
+A `Supplier` can be _reactive_ - `Supplier<Flux<String>>` - or _imperative_ - `Supplier<String>`.
+From an invocation standpoint, this should make no difference to the implementor of such a `Supplier`.
+
+However, when used within frameworks (e.g. https://spring.io/projects/spring-cloud-stream[Spring Cloud Stream]), Suppliers, especially reactive, are often used to represent the source of a stream.
+Therefore, they are invoked once to get the stream (e.g. `Flux`) to which consumers can subscribe.
+In other words, such suppliers represent an equivalent of an _infinite stream_.
+
+However, the same reactive suppliers can also represent a _finite_ stream (e.g. a result set from polled JDBC data).
+In those cases, such reactive suppliers must be hooked up to some polling mechanism of the underlying framework.
-To assist with that Spring Cloud Function provides a marker annotation
-`org.springframework.cloud.function.context.PollableBean` to signal that such supplier produces a
-finite stream and may need to be polled again. That said, it is important to understand that Spring Cloud Function itself
-provides no behavior for this annotation.
+To assist with that, Spring Cloud Function provides a marker annotation, `org.springframework.cloud.function.context.PollableBean`, to signal that such a supplier produces a finite stream and may need to be polled again.
+However, it is important to understand that Spring Cloud Function itself provides no behavior for this annotation.
-In addition `PollableBean` annotation exposes a _splittable_ attribute to signal that produced stream
-needs to be split (see https://www.enterpriseintegrationpatterns.com/patterns/messaging/Sequencer.html[Splitter EIP])
+In addition, the `PollableBean` annotation exposes a _splittable_ attribute to signal that the produced stream needs to be split (see https://www.enterpriseintegrationpatterns.com/patterns/messaging/Sequencer.html[Splitter EIP]).
-Here is the example:
+Here is an example:
[source, java]
----
@@ -122,70 +125,102 @@ public Supplier> someSupplier() {
[[function]]
=== Function
-Function can also be written in imperative or reactive way, yet unlike Supplier and Consumer there are
-no special considerations for the implementor other then understanding that when used within frameworks
-such as https://spring.io/projects/spring-cloud-stream[Spring Cloud Stream] and others, reactive function is
-invoked only once to pass a reference to the stream (Flux or Mono) and imperative is invoked once per event.
+
+Functions can also be written in an imperative or reactive way.
+Yet, unlike `Supplier` and `Consumer`, there are no special considerations for the implementor other than understanding that, when used within frameworks such as https://spring.io/projects/spring-cloud-stream[Spring Cloud Stream], a reactive function is invoked only once to pass a reference to the stream (i.e. `Flux` or `Mono`), whereas an imperative function is invoked once per event.
+
+[source, java]
+----
+public Function<String, String> uppercase() {
+ . . . .
+}
+----
+
+[[bifunction]]
+=== BiFunction
+
+In the event you need to receive some additional data (metadata) with your payload, you can always declare your function signature to receive a `Message` containing a map of headers with additional information.
+
+[source, java]
+----
+public Function<Message<String>, String> uppercase() {
+ . . . .
+}
+----
+
+To make your function signature a bit lighter and more POJO-like, there is another approach: you can use a `BiFunction`.
+
+[source, java]
+----
+public BiFunction<String, Map<String, Object>, String> uppercase() {
+ . . . .
+}
+----
+
+Given that a `Message` only contains two attributes (payload and headers), and a `BiFunction` requires two input parameters, the framework will automatically recognise this signature, extracting the payload from the `Message` and passing it as the first argument, with a `Map` of headers as the second.
+As a result, your function is not coupled to Spring’s messaging API.
+Keep in mind that `BiFunction` requires a strict signature where the second argument *must* be a `Map`.
+The same rule applies to `BiConsumer`.
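To illustrate why this works, here is a framework-free sketch of the idea; the `SimpleMessage` record and `adapt` helper below are hypothetical stand-ins for what the framework does internally, not Spring Cloud Function API:

```java
import java.util.Map;
import java.util.function.BiFunction;
import java.util.function.Function;

public class BiFunctionAdapterSketch {
    // Hypothetical stand-in for a message: just a payload plus a map of headers.
    record SimpleMessage(String payload, Map<String, Object> headers) {}

    // Adapt a BiFunction(payload, headers) into a Function over the message,
    // mirroring how a framework could split the two message attributes
    // across the BiFunction's two input parameters.
    static Function<SimpleMessage, String> adapt(BiFunction<String, Map<String, Object>, String> fn) {
        return msg -> fn.apply(msg.payload(), msg.headers());
    }

    public static void main(String[] args) {
        BiFunction<String, Map<String, Object>, String> uppercase =
            (payload, headers) -> payload.toUpperCase() + " (from " + headers.get("source") + ")";
        Function<SimpleMessage, String> asFunction = adapt(uppercase);
        System.out.println(asFunction.apply(new SimpleMessage("hello", Map.of("source", "test"))));
        // prints "HELLO (from test)"
    }
}
```

Note that the user-defined `BiFunction` above never touches a messaging API; it only sees the payload and a plain `Map` of headers.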
[[consumer]]
=== Consumer
-Consumer is a little bit special because it has a `void` return type,
-which implies blocking, at least potentially. Most likely you will not
-need to write `Consumer>`, but if you do need to do that,
-remember to subscribe to the input flux.
+
+Consumer is a little bit special because it has a `void` return type, which implies blocking, at least potentially.
+Most likely you will not need to write `Consumer<Flux<?>>`, but if you do need to do that, remember to subscribe to the input `Flux`.
[[function-composition]]
== Function Composition
+
Function Composition is a feature that allows one to compose several functions into one.
-The core support is based on function composition feature available with https://docs.oracle.com/javase/8/docs/api/java/util/function/Function.html#andThen-java.util.function.Function-[Function.andThen(..)]
-support available since Java 8. However on top of it, we provide few additional features.
+The core support is based on the function composition feature provided by https://docs.oracle.com/javase/8/docs/api/java/util/function/Function.html#andThen-java.util.function.Function-[Function.andThen(..)], available since Java 8.
+However, Spring Cloud Function provides a few additional features on top of this.
[[declarative-function-composition]]
=== Declarative Function Composition
-This feature allows you to provide composition instruction in a declarative way using `|` (pipe) or `,` (comma) delimiter
-when providing `spring.cloud.function.definition` property.
+This feature allows you to provide composition instructions in a declarative way using `|` (pipe) or `,` (comma) delimiters when setting the `spring.cloud.function.definition` property.
-For example
+For example:
----
--spring.cloud.function.definition=uppercase|reverse
----
-Here we effectively provided a definition of a single function which itself is a composition of
-function `uppercase` and function `reverse`. In fact that is one of the reasons why the property name is _definition_ and not _name_,
-since the definition of a function can be a composition of several named functions.
-And as mentioned you can use `,` instead of pipe (such as `...definition=uppercase,reverse`).
+
+Here, we effectively provided a definition of a single function which itself is a composition of function `uppercase` and function `reverse`.
+In fact, that is one of the reasons why the property name is _definition_ and not _name_, since the definition of a function can be a composition of several named functions.
+As mentioned, you can use `,` instead of `|`, such as `...definition=uppercase,reverse`.
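Conceptually, the `uppercase|reverse` definition corresponds to plain `Function.andThen(..)` composition; a minimal plain-Java sketch (the function bodies here are illustrative):

```java
import java.util.function.Function;

public class ComposeSketch {
    public static void main(String[] args) {
        Function<String, String> uppercase = String::toUpperCase;
        Function<String, String> reverse = s -> new StringBuilder(s).reverse().toString();

        // Rough equivalent of spring.cloud.function.definition=uppercase|reverse
        Function<String, String> composed = uppercase.andThen(reverse);
        System.out.println(composed.apply("hello")); // prints "OLLEH"
    }
}
```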
[[composing-non-functions]]
=== Composing non-Functions
-Spring Cloud Function also supports composing Supplier with `Consumer` or `Function` as well as `Function` with `Consumer`.
-What's important here is to understand the end product of such definitions.
-Composing Supplier with Function still results in Supplier while composing Supplier with Consumer will effectively render Runnable.
-Following the same logic composing Function with Consumer will result in Consumer.
-And of course you can't compose uncomposable such as Consumer and Function, Consumer and Supplier etc.
+Spring Cloud Function also supports composing `Supplier` with `Consumer` or `Function` as well as `Function` with `Consumer`.
+What's important to understand is the end product of such definitions.
+Composing `Supplier` with `Function` still results in a `Supplier`, while composing `Supplier` with `Consumer` will effectively render a `Runnable`.
+Following the same logic, composing `Function` with `Consumer` will result in `Consumer`.
+
+And, of course, you can't compose uncomposable objects such as `Consumer` and `Function`, `Consumer` and `Supplier`, etc.
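These rules can be sketched in plain Java; the `compose` helpers below are illustrative only (the framework performs the equivalent wrapping for you when it parses the definition):

```java
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Supplier;

public class CompositionResultsSketch {
    // Composing Supplier with Function still yields a Supplier.
    static <T, R> Supplier<R> compose(Supplier<T> supplier, Function<T, R> function) {
        return () -> function.apply(supplier.get());
    }

    // Composing Supplier with Consumer effectively yields a Runnable.
    static <T> Runnable compose(Supplier<T> supplier, Consumer<T> consumer) {
        return () -> consumer.accept(supplier.get());
    }

    // Composing Function with Consumer yields a Consumer.
    static <T, R> Consumer<T> compose(Function<T, R> function, Consumer<R> consumer) {
        return t -> consumer.accept(function.apply(t));
    }

    public static void main(String[] args) {
        Supplier<String> hello = () -> "hello";

        Supplier<String> upper = compose(hello, (Function<String, String>) String::toUpperCase);
        System.out.println(upper.get()); // prints "HELLO"

        Runnable printOnce = compose(hello, (Consumer<String>) System.out::println);
        printOnce.run(); // prints "hello"
    }
}
```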
[[function-routing-and-filtering]]
== Function Routing and Filtering
-Since version 2.2 Spring Cloud Function provides routing feature allowing
-you to invoke a single function which acts as a router to an actual function you wish to invoke
-This feature is very useful in certain FAAS environments where maintaining configurations
-for several functions could be cumbersome or exposing more than one function is not possible.
+Since version 2.2, Spring Cloud Function provides a routing feature that allows you to invoke a single function which acts as a router to the actual function you wish to invoke.
+This feature is very useful in certain FaaS environments where maintaining configurations for several functions could be cumbersome, or where exposing more than one function is not possible.
-The `RoutingFunction` is registered in _FunctionCatalog_ under the name `functionRouter`. For simplicity
-and consistency you can also refer to `RoutingFunction.FUNCTION_NAME` constant.
+The `RoutingFunction` is registered in _FunctionCatalog_ under the name `functionRouter`.
+For simplicity and consistency, you can also refer to the `RoutingFunction.FUNCTION_NAME` constant.
This function has the following signature:
[source, java]
----
public class RoutingFunction implements Function