Alternatively, you can build a Rust-based Lambda function declaratively using the [Serverless framework Rust plugin](https://github.com/softprops/serverless-rust).
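With the plugin, the build and deployment of your function are described declaratively in the project's `serverless.yml`. As a rough sketch (the service name, function name, and route below are illustrative assumptions, not taken from this project), that file might look like:

```yaml
service: hello-rust

provider:
  name: aws
  # the plugin keys off this runtime value to build with cargo
  runtime: rust
  memorySize: 128

plugins:
  # delegates builds and packaging to the Rust plugin
  - serverless-rust

# Rust binaries are packaged individually, one artifact per function
package:
  individually: true

functions:
  hello:
    # handler is the name of the cargo binary to build and package
    handler: hello
    events:
      - http:
          path: /hello
          method: GET
```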
A number of getting-started Serverless application templates exist to get you up and running quickly:
- a minimal [echo function](https://github.com/softprops/serverless-aws-rust) to demonstrate what the smallest Rust function setup looks like
- a minimal [http function](https://github.com/softprops/serverless-aws-rust-http) to demonstrate how to interface with API Gateway using Rust's native [http](https://crates.io/crates/http) crate (note this will be a git dependency until 0.2 is published)
```bash
$ npx serverless install \
  ...
  && npm install --silent
```
Deploy it using the standard serverless workflow:
```bash
# build, package, and deploy service to aws lambda
$ npx serverless deploy
```
Alternatively, you can build a Rust-based Lambda function in a [docker mirror of the AWS Lambda provided runtime with the Rust toolchain preinstalled](https://github.com/rust-serverless/lambda-rust).
Running the following command will start an ephemeral docker container, which will build your Rust application and produce a zip file containing its binary, auto-renamed to `bootstrap` to meet AWS Lambda's expectations for binaries, under `target/lambda_runtime/release/{your-binary-name}.zip`. Typically this is just the name of your crate if you are using the cargo default binary (i.e. `main.rs`):
```bash
# build and package deploy-ready artifact
$ docker run --rm \
  ...
  rustserverless/lambda-rust
```
With your application built and packaged, it's ready to ship to production. You can also invoke it locally to verify its behavior using the [lambci:provided docker container](https://hub.docker.com/r/lambci/lambda/), which is also a mirror of the AWS Lambda provided runtime with build dependencies omitted:
```bash
# start a docker container replicating the "provided" lambda runtime
...
```
### Lambda Debug Proxy
Lambdas can be run and debugged locally using a special [Lambda debug proxy](https://github.com/rimutaka/lambda-debug-proxy) (a non-AWS repo maintained by @rimutaka), which is a Lambda function that forwards incoming requests to one AWS SQS queue and reads responses from another queue. A local proxy running on your development computer reads the queue, calls your Lambda locally and sends back the response. This approach allows debugging of Lambda functions locally while being part of your AWS workflow. The Lambda handler code does not need to be modified between the local and AWS versions.
## `lambda_runtime`
This project does not currently include Lambda event struct definitions. Instead, you can define your own event and response types, as described below.
259
259
260
260
### Custom event objects
To serialize and deserialize events and responses, we suggest using the [`serde`](https://github.com/serde-rs/serde) library. To receive custom events, annotate your structure with Serde's macros: