Commit 22f3941: Update README to fix various typos (#485)
Parent: 76d26b9

1 file changed: 6 additions, 6 deletions


README.md (6 additions & 6 deletions)
@@ -171,7 +171,7 @@ $ cat output.json # Prints: {"msg": "Command Say Hi! executed."}
 
 Alternatively, you can build a Rust-based Lambda function declaratively using the [Serverless framework Rust plugin](https://github.com/softprops/serverless-rust).
 
-A number of getting started Serverless application templates exist to get you up and running quickly
+A number of getting started Serverless application templates exist to get you up and running quickly:
 
 - a minimal [echo function](https://github.com/softprops/serverless-aws-rust) to demonstrate what the smallest Rust function setup looks like
 - a minimal [http function](https://github.com/softprops/serverless-aws-rust-http) to demonstrate how to interface with API Gateway using Rust's native [http](https://crates.io/crates/http) crate (note this will be a git dependency until 0.2 is published)
@@ -188,7 +188,7 @@ $ npx serverless install \
   && npm install --silent
 ```
 
-Deploy it using the standard serverless workflow
+Deploy it using the standard serverless workflow:
 
 ```bash
 # build, package, and deploy service to aws lambda
@@ -205,7 +205,7 @@ $ npx serverless invoke -f hello -d '{"foo":"bar"}'
 
 Alternatively, you can build a Rust-based Lambda function in a [docker mirror of the AWS Lambda provided runtime with the Rust toolchain preinstalled](https://github.com/rust-serverless/lambda-rust).
 
-Running the following command will start a ephemeral docker container which will build your Rust application and produce a zip file containing its binary auto-renamed to `bootstrap` to meet the AWS Lambda's expectations for binaries under `target/lambda_runtime/release/{your-binary-name}.zip`, typically this is just the name of your crate if you are using the cargo default binary (i.e. `main.rs`)
+Running the following command will start an ephemeral docker container, which will build your Rust application and produce a zip file containing its binary auto-renamed to `bootstrap` to meet AWS Lambda's expectations for binaries, under `target/lambda_runtime/release/{your-binary-name}.zip`. Typically, this is just the name of your crate if you are using the cargo default binary (i.e. `main.rs`):
 
 ```bash
 # build and package deploy-ready artifact
@@ -216,7 +216,7 @@ $ docker run --rm \
   rustserverless/lambda-rust
 ```
 
-With your application build and packaged, it's ready to ship to production. You can also invoke it locally to verify is behavior using the [lambdaci :provided docker container](https://hub.docker.com/r/lambci/lambda/) which is also a mirror of the AWS Lambda provided runtime with build dependencies omitted.
+With your application built and packaged, it's ready to ship to production. You can also invoke it locally to verify its behavior using the [lambci :provided docker container](https://hub.docker.com/r/lambci/lambda/), which is also a mirror of the AWS Lambda provided runtime with build dependencies omitted:
 
 ```bash
 # start a docker container replicating the "provided" lambda runtime
@@ -245,7 +245,7 @@ You can read more about how [cargo lambda start](https://github.com/calavera/car
 
 ### Lambda Debug Proxy
 
-Lambdas can be run and debugged locally using a special [Lambda debug proxy](https://github.com/rimutaka/lambda-debug-proxy) (a non-AWS repo maintained by @rimutaka), which is a Lambda function that forwards incoming requests to one AWS SQS queue and reads responses from another queue. A local proxy running on your development computer reads the queue, calls your lambda locally and sends back the response. This approach allows debugging of Lambda functions locally while being part of your AWS workflow. The lambda handler code does not need to be modified between the local and AWS versions.
+Lambdas can be run and debugged locally using a special [Lambda debug proxy](https://github.com/rimutaka/lambda-debug-proxy) (a non-AWS repo maintained by @rimutaka), which is a Lambda function that forwards incoming requests to one AWS SQS queue and reads responses from another queue. A local proxy running on your development computer reads the queue, calls your Lambda locally and sends back the response. This approach allows debugging of Lambda functions locally while being part of your AWS workflow. The Lambda handler code does not need to be modified between the local and AWS versions.
 
 ## `lambda_runtime`
 

@@ -259,7 +259,7 @@ This project does not currently include Lambda event struct definitions. Instead
 
 ### Custom event objects
 
-To serialize and deserialize events and responses, we suggest using the use the [`serde`](https://github.com/serde-rs/serde) library. To receive custom events, annotate your structure with Serde's macros:
+To serialize and deserialize events and responses, we suggest using the [`serde`](https://github.com/serde-rs/serde) library. To receive custom events, annotate your structure with Serde's macros:
 
 ```rust
 use serde::{Serialize, Deserialize};
