finished ch17.1
Some checks failed
Test Gitea Actions / first (push) Successful in 16s
Test Gitea Actions / check-code (push) Failing after 15s
Test Gitea Actions / test (push) Has been skipped
Test Gitea Actions / documentation-check (push) Has been skipped

This commit is contained in:
darkicewolf50 2025-03-19 16:16:19 -06:00
parent 20d58f8ef5
commit 8f01c0e0e4
8 changed files with 2597 additions and 19 deletions

1
Applying Async.md Normal file

@ -0,0 +1 @@
# Applying Concurrency with Async


@ -1,13 +1,15 @@
# Fundamentals of Asynchronous Programming: Async, Await, Futures, and Streams
Many of the operations we ask computers to do can take a while to finish.
It would be nice to be able to do something else while we are waiting for those long-running processes to complete.
Modern computers offer two techniques for working on more than one operation at a time:
- Parallelism
- Concurrency
Once we start writing programs that involve parallel or concurrent operations, we quickly encounter new challenges inherent to _asynchronous programming_.
This is where operations may not finish in the order in which they were started.
@ -21,7 +23,7 @@ Say you are exporting a video you created of a family celebration, an operation
The video export will use as many resources as available on the GPU and CPU.
If you only had one CPU core and your operating system didn't pause that export until it completed (it executed _synchronously_), you wouldn't be able to do anything else on the computer while that task was running.
This would be an incredibly frustrating experience.
@ -35,11 +37,11 @@ While you can start reading the data once it starts to arrive, it may take some
Even once the data is all present, if the video is quite large, it could take at least a second or two to load it all.
The video export is an example of a _CPU-bound_ or _compute-bound_ operation.
It is limited by the computer's potential data processing speed within the CPU or GPU, and how much speed it can dedicate to the operation.
The video download is an example of an _IO-bound_ operation because it is limited by the speed of the computer's _input and output_; it can only go as fast as the data can be sent across the network.
In both of these examples, the operating system's invisible interrupts provide a form of concurrency.
@ -49,13 +51,13 @@ In many cases, because we understand our programs at a much more granular level
Let's say we are building a tool to manage file downloads. We should be able to write our program so that starting one download won't lock up the UI, and users should be able to start multiple downloads at the same time.
Many operating system APIs for interacting with the network are _blocking_; that is, they block the program's progress until the data they are processing is completely ready.
Note: This is how _most_ function calls work.
However, the term _blocking_ is usually reserved for function calls that interact with files, the network, or other computer resources.
In cases like these, an individual program would benefit from the operation being _non_-blocking.
We could avoid blocking our main thread by spawning a dedicated thread to download each file.
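As a rough sketch of that thread-per-download approach (the `download` helper below is a hypothetical blocking call, not something from the chapter), it might look like this:
```rust
use std::thread;

// A rough sketch, not from the chapter: spawn one OS thread per download so the
// main thread stays responsive. `download` stands in for a blocking network call.
fn download_all(urls: Vec<String>) {
    let handles: Vec<_> = urls
        .into_iter()
        .map(|url| thread::spawn(move || download(&url)))
        .collect();

    for handle in handles {
        // Wait for every download thread to finish.
        handle.join().unwrap();
    }
}

fn download(url: &str) {
    // Placeholder for a blocking network request.
    println!("downloading {url}");
}
```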
@ -64,20 +66,24 @@ The overhead of those threads would eventually become a problem.
It would be preferable if the call didn't block in the first place; it would be better if we could write in the same style we use in blocking code.
Something similar to this code:
```rust
let data = fetch_data_from(url).await;
println!("{data}");
```
This is exactly what Rust's async (short for _asynchronous_) abstraction gives us.
In this chapter we will learn about:
- Futures and Async Syntax [Section Link Here](./Futures%20and%20Async.md)
- How to use Rust's `async` and `await` syntax [Section Link Here](./Applying%20Async.md)
- How to use the async model to solve some of the same challenges we looked at in Ch 16 [Section Link Here]()
- How multithreading and async provide complementary solutions that you can combine in many cases [Section Link Here]()
Before jumping into how async works in practice, we need to take a short detour to discuss the differences between parallelism and concurrency.
## Parallelism and Concurrency
So far, we have treated parallelism and concurrency as mostly interchangeable.
Now we need to distinguish between them more precisely, because the differences will now start to show up.
@ -86,7 +92,7 @@ Consider the different ways a team could split up work on a software project.
You could assign a single member multiple tasks, assign each member one task, or use a mix of the two approaches.
When an individual works on several different tasks before any of them is complete, this is _concurrency_.
Or maybe you have two different projects checked out on your computer and when you get bored or stuck on one project, you switch to the other.
@ -94,17 +100,17 @@ As one person, so you can't make progress on both tasks at the exact same time,
Here is a picture of this
<img src="https://doc.rust-lang.org/book/img/trpl17-01.svg" />
When the team instead splits up a group of tasks by having each member take one task and work on it alone, this is _parallelism_.
Each person on the team can make progress at the exact same time.
<img src="https://doc.rust-lang.org/book/img/trpl17-02.svg" />
In both of these workflows, you might have to coordinate between different tasks.
Maybe it was _thought_ that a task was totally independent from every other task, but it actually requires another person on the team to finish their task first.
Some of this work can be done in parallel, but some of it actually was _serial_: it could only happen in a series, one task after another.
Here is a diagram of this
<img src="https://doc.rust-lang.org/book/img/trpl17-03.svg" />


@ -1 +1,417 @@
# Futures and the Async Syntax
The key parts of asynchronous programming in Rust are _futures_ and Rust's `async` and `await` keywords.
A _future_ is a value that may not be ready now but will become ready at some point in the future.
In other languages the same concept shows up under other names such as _task_ or _promise_.
Rust provides a `Future` trait as a building block so that different async operations can be implemented with different data structures but with a common interface.
Rust futures are types that implement the `Future` trait.
Each future holds its own information about the progress that has been made and what "ready" means.
You can apply the `async` keyword to blocks and functions to specify that they can be interrupted and resumed.
Within an async block or async function, you can use the `await` keyword to _await a future_ (that is, wait for it to become ready).
Any point where you await a future within an async block or function is a potential spot for that async block or function to pause and resume.
The process of checking with a future to see if its value is available yet is called _polling_.
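As a tiny sketch of those pieces (not from the chapter), an async function can await an async block:
```rust
// A minimal sketch, not from the chapter: an async function awaiting an async block.
async fn example() {
    let block = async { 40 + 2 }; // an async block: a future that has not run yet
    let answer = block.await;     // a potential pause-and-resume point
    println!("{answer}");
}
```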
Some other languages, such as C# and JavaScript, also use the `async` and `await` keywords for async programming.
There are some significant differences in how Rust does things, including how it handles the syntax.
When writing async Rust, we use the `async` and `await` keywords most of the time.
Rust compiles them into equivalent code using the `Future` trait, much as it compiles `for` loops into equivalent code using the `Iterator` trait.
Because Rust provides the `Future` trait, you can also implement it for your own data types when you need to.
Many of the functions that we will see have return types with their own implementations of `Future`.
We will return to the definition of the trait at the end of the chapter and dig into more of how it works.
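As a rough sketch of what implementing `Future` by hand for your own type could look like (an illustration and an assumption on my part; the chapter's futures are all generated by `async`):
```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll};

// A sketch, not from the chapter: a hand-written future that is immediately
// ready with its value. Most real futures are produced by `async` blocks.
struct Ready(Option<u32>);

impl Future for Ready {
    type Output = u32;

    fn poll(mut self: Pin<&mut Self>, _cx: &mut Context<'_>) -> Poll<Self::Output> {
        // Hand the stored value out the first time we are polled.
        Poll::Ready(self.0.take().expect("polled after completion"))
    }
}
```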
This may all feel a bit abstract, so we will dive into our first program: a little web scraper.
We will pass in two URLs from the command line, fetch both of them concurrently, and return the result of whichever one finishes first.
This will have a fair bit of new syntax.
## Our First Async Program
To keep focus on learning async rather than juggling parts of the ecosystem, we created the `trpl` crate (this is short for "The Rust Programming Language").
This re-exports all the types, traits, and functions you will need, primarily from the [`futures`](https://crates.io/crates/futures) and [`tokio`](https://tokio.rs/) crates.
The `futures` crate is an official home for Rust experimentation for async code, and it is where the `Future` trait was originally designed.
Tokio is the most widely used `async` runtime in Rust today, especially for web applications.
Here we use the `tokio` crate under the hood for `trpl` because it is well tested and widely used.
In some cases `trpl` also renames or wraps the original APIs to keep us focused on the details relevant to this chapter.
If you want to understand in depth what the crate does, check out [its source code](https://github.com/rust-lang/book/tree/main/packages/trpl).
You will then be able to see what crate each re-export comes from, and we have left extensive comments explaining what the crate does.
First we will start by building a little command line tool that fetches two web pages, pulls the `<title>` element from each, and prints out the title of whichever page finishes that whole process first.
## Defining the `page_title` Function
First we will start by writing a function that takes one page URL as a parameter
```rust
extern crate trpl; // required for mdbook test

fn main() {
    // TODO: we'll add this next!
}

use trpl::Html;

async fn page_title(url: &str) -> Option<String> {
    let response = trpl::get(url).await;
    let response_text = response.text().await;
    Html::parse(&response_text)
        .select_first("title")
        .map(|title_element| title_element.inner_html())
}
```
First we define a function named `page_title` and mark it with the `async` keyword.
We then use the `trpl::get` function to fetch whatever URL is passed in and add the `await` keyword to await the response.
To get the response, we call its `text` method, and once again await it with the `await` keyword.
Both of these steps are asynchronous.
For the `get` function, we have to wait for the server to send back the first part of its response. This will include HTTP headers, cookies, and so on, and can be delivered separately from the response body.
Especially if the body is very large, it can take some time for it all to arrive.
Because we have to wait for the _entirety_ of the response to arrive, the `text` method is also async.
Here we have to explicitly await both of these futures, because futures in Rust are _lazy_.
Futures will not do anything until you ask them to with the `await` keyword. (Rust will show a compiler warning if you don't use a future.)
This might remind you of iterators in the section [Processing a Series of Items With Iterators](./Iterators.md).
Iterators do nothing unless you call their `next` method, whether directly or by using `for` loops or methods such as `map` that use `next` under the hood.
Likewise, futures do nothing unless you explicitly ask them to.
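As a quick illustration of that analogy (a sketch, not from the chapter):
```rust
// A sketch, not from the chapter: like futures, iterator adapters are lazy.
fn main() {
    let doubled = (1..4).map(|n| n * 2); // nothing has run yet
    // Only consuming the iterator drives it, just as `.await` drives a future.
    let values: Vec<i32> = doubled.collect();
    println!("{values:?}"); // prints [2, 4, 6]
}
```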
This laziness allows Rust to avoid running async code until it's actually needed.
This is different from the behavior we saw before when using `thread::spawn` in [Creating a New Thread with `spawn`](./Simultaneous%20Code%20Running.md#creating-a-new-thread-with-spawn), where the closure we passed to another thread started running immediately.
This is also different from how many other languages approach async.
This is important for Rust, and we will see why later.
Once we have `response_text`, we can then parse it into an instance of the `Html` type using `Html::parse`.
Instead of a raw string, we now have a data type we can use to work with the HTML as a richer data structure.
In particular, we can use the `select_first` method to find the first instance of a given CSS selector.
By passing the string `"title"`, we get the first `<title>` element in the document, if there is one.
Because there may not be any matching element, `select_first` returns an `Option<ElementRef>`.
Lastly, we use the `Option::map` method, which lets us work with the item in the `Option` if it is present and do nothing if it isn't.
We could also use a `match` expression, but `map` is more idiomatic.
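For comparison, here is a sketch (not from the chapter) of the same tail expression written with `match` instead of `Option::map`:
```rust
use trpl::Html;

// A sketch, not from the chapter: `page_title` with a `match` in place of `map`.
async fn page_title(url: &str) -> Option<String> {
    let response_text = trpl::get(url).await.text().await;
    match Html::parse(&response_text).select_first("title") {
        Some(title_element) => Some(title_element.inner_html()),
        None => None,
    }
}
```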
In the body of the function we supply to `map`, we call `inner_html` on `title_element` to get its content, which is a `String`.
When it is all done, we have an `Option<String>`.
Note that Rust's `await` keyword goes _after_ the expression you are awaiting, not before it.
It is a _postfix_ keyword.
This may differ from how you have used `await` in other languages, but in Rust it makes chains of methods much nicer to work with.
As a result, we can change the body of `page_title` to chain the `trpl::get` and `text` function calls together with `await` between them.
```rust
let response_text = trpl::get(url).await.text().await;
```
With this we have successfully written our first async function.
Before we add some code in `main` to call it, we will dive even deeper into what we have written and what it means.
When Rust sees a block marked with the `async` keyword, it compiles it into a unique, anonymous data type that implements the `Future` trait.
When Rust sees a function marked with `async`, it compiles it into a non-async function whose body is an async block.
An async function's return type is the type of the anonymous data type the compiler creates for that async block.
Writing `async fn` is equivalent to writing a function that returns a _future_ of the return type.
To the compiler, a function definition such as the `async fn page_title` is equivalent to a non-async function defined like this:
```rust
use std::future::Future;
use trpl::Html;

fn page_title(url: &str) -> impl Future<Output = Option<String>> + '_ {
    async move {
        let text = trpl::get(url).await.text().await;
        Html::parse(&text)
            .select_first("title")
            .map(|title| title.inner_html())
    }
}
```
Let's go through each part of the transformed version:
- It uses the `impl Trait` syntax we discussed in Ch 10 in the ["Traits as Parameters"](./Traits.md#traits-as-parameters) section
- The returned trait is a `Future` with an associated type of `Output`.
- Note that the `Output` type is `Option<String>`, which is the same as the original return type from the `async fn` version of `page_title`.
- This async block produces a value with the type `Option<String>`.
- That value matches the `Output` type in the return type.
- This is just like other blocks you have seen.
- All of the code called in the body of the original function is wrapped in an `async move` block.
- Blocks are expressions.
- This whole block is the expression returned from the function.
- That value matches the Output type in the return type.
- The new function body is an `async move` block because of how it uses the `url` parameter.
- We will go into `async` versus `async move` later.
- The new version of the function has a kind of lifetime we haven't seen before in the output type: `'_`
- This is because the function returns a future that refers to a reference; in this case, the reference from the `url` parameter.
- Here we need to tell Rust that we want that reference to be included.
- We don't have to name the lifetime here because Rust is smart enough to know there is only one reference that could be involved, but we _do_ have to be explicit that the resulting future is bound by that lifetime (a sketch with the lifetime written out follows this list).
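As a sketch (an illustration, not from the chapter), the same signature with the lifetime named explicitly would look like this:
```rust
use std::future::Future;
use trpl::Html;

// A sketch, not from the chapter: the same function with the lifetime spelled
// out as `'a` instead of the anonymous `'_` lifetime.
fn page_title<'a>(url: &'a str) -> impl Future<Output = Option<String>> + 'a {
    async move {
        let text = trpl::get(url).await.text().await;
        Html::parse(&text)
            .select_first("title")
            .map(|title| title.inner_html())
    }
}
```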
## Determining a Single Page's Title
To start, we will just get a single page.
Here we follow the same pattern as we used in Ch 12 to get command line arguments.
Then we pass the first URL to `page_title` and await the result.
Because the value produced by the future is an `Option<String>`, we use a `match` expression to print different messages to account for whether the page has a `<title>`.
```rust
extern crate trpl; // required for mdbook test

use trpl::Html;

async fn main() {
    let args: Vec<String> = std::env::args().collect();
    let url = &args[1];
    match page_title(url).await {
        Some(title) => println!("The title for {url} was {title}"),
        None => println!("{url} had no title"),
    }
}

async fn page_title(url: &str) -> Option<String> {
    let response_text = trpl::get(url).await.text().await;
    Html::parse(&response_text)
        .select_first("title")
        .map(|title_element| title_element.inner_html())
}
```
This will not compile.
The only place we can use the `await` keyword is in async functions or blocks.
Rust will not let us mark the special `main` function as `async`.
We will get this error:
```
error[E0752]: `main` function is not allowed to be `async`
 --> src/main.rs:6:1
  |
6 | async fn main() {
  | ^^^^^^^^^^^^^^^ `main` function is not allowed to be `async`
```
The reason that `main` can't be marked `async` is that async code needs a *runtime*: a Rust crate that manages the details of executing asynchronous code.
A program's `main` function can *initialize* a runtime but it is not a runtime *itself*.
(We will see why this is the case in a bit)
Every Rust program that executes async has at least one place where it sets up a runtime that executes the futures.
Most languages that support async bundle a runtime, but Rust doesn't.
Instead, there are many different async runtimes available, each of which makes different tradeoffs suitable to the use case it targets.
An example of this: a high-throughput web server with many CPU cores and a large amount of RAM has very different needs than a microcontroller with a single core, a small amount of RAM, and no heap allocation ability.
The crates that provide those runtimes also often supply async versions of common functionality such as file or network I/O.
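For instance, a runtime crate such as `tokio` ships async file I/O; a sketch (using `tokio` directly is an assumption here, the chapter itself goes through `trpl`):
```rust
// A sketch, not from the chapter: tokio (one such runtime crate) provides async
// versions of common I/O, such as reading a file without blocking the thread.
async fn read_config() -> std::io::Result<String> {
    tokio::fs::read_to_string("config.toml").await
}
```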
Throughout the rest of this chapter, we will use the `run` function from the `trpl` crate, which takes a future as an argument and runs it to completion.
Behind the scenes, calling `run` sets up a runtime that is used to run the future passed in.
Once this completes, `run` returns whatever value the future produced.
We could have passed the future returned by `page_title` directly to `run`, and once it completed, matched on the resulting `Option<String>` as we did before.
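A sketch of that alternative, reusing the `page_title` from above, might look like this:
```rust
// A sketch, not the chapter's final listing: hand the future straight to
// `trpl::run` instead of wrapping the call in an async block.
fn main() {
    let args: Vec<String> = std::env::args().collect();
    let url = &args[1];

    match trpl::run(page_title(url)) {
        Some(title) => println!("The title for {url} was {title}"),
        None => println!("{url} had no title"),
    }
}
```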
However, for most of the examples in the chapter (and most async code in the real world), we'll be doing more than just one async function call.
Instead we will pass an `async` block and explicitly await the result of the `page_title` call.
Here is the updated version
```rust
fn main() {
    let args: Vec<String> = std::env::args().collect();

    trpl::run(async {
        let url = &args[1];
        match page_title(url).await {
            Some(title) => println!("The title for {url} was {title}"),
            None => println!("{url} had no title"),
        }
    })
}
```
Now when we run this code we get the behavior we expected initially
```
$ cargo run -- https://www.rust-lang.org
Finished `dev` profile [unoptimized + debuginfo] target(s) in 0.05s
Running `target/debug/async_await 'https://www.rust-lang.org'`
The title for https://www.rust-lang.org was
Rust Programming Language
```
Now we finally have some working async code.
But before we add the code to race the two sites against each other, we will briefly turn back to how futures work.
Each *await point*, every place where the code uses the `await` keyword, represents a place where control is handed back to the runtime.
To make this work, Rust needs to keep track of the state involved in the async block so that the runtime can kick off some other work and then come back when it is ready to try advancing the first one again.
This is an invisible state machine, as if you had written an enum like this to save the current state at each await point:
```rust
enum PageTitleFuture<'a> {
    Initial { url: &'a str },
    GetAwaitPoint { url: &'a str },
    TextAwaitPoint { response: trpl::Response },
}
```
Writing the code to transition between each state by hand would be tedious and error-prone, especially when you need to add more functionality and more states to the code later.
The Rust compiler creates and manages the state machine data structures for async code automatically.
The normal borrowing and ownership rules around data structures all still apply.
The compiler also handles checking those for us and provides useful error messages.
Ultimately, something has to execute this state machine, and that something is a runtime.
(This is why you may come across references to *executors* when looking into runtimes: an executor is the part of a runtime responsible for executing the async code.)
You can now see why the compiler stopped us from making `main` itself an async function earlier.
If `main` were an async function, something else would need to manage the state machine for whatever future `main` returned, but `main` is the starting point for the program.
Instead we call the `trpl::run` function in `main` to set up a runtime and run the future returned by the `async` block until it returns `Ready`.
Note: Some runtimes provide macros so you *are* able to write an async `main` function.
These macros rewrite `async fn main() { ... }` to be a normal `fn main`.
This does the same thing as we did by hand before: call a function that runs a future to completion the way `trpl::run` does.
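A sketch of what that looks like with one such runtime, `tokio` (an assumption on my part; the chapter itself sticks with `trpl::run`):
```rust
// A sketch, not from the chapter: tokio's attribute macro rewrites this into a
// normal `fn main` that builds a runtime and runs the async body to completion.
#[tokio::main]
async fn main() {
    println!("hello from an async main");
}
```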
Now we can put these pieces together and see how we can write concurrent code.
## Racing Our Two URLs Against Each Other
Here we will call `page_title` with two different URLs passed in from the command line and race them.
```rust
use trpl::{Either, Html};

fn main() {
    let args: Vec<String> = std::env::args().collect();

    trpl::run(async {
        let title_fut_1 = page_title(&args[1]);
        let title_fut_2 = page_title(&args[2]);
        let (url, maybe_title) =
            match trpl::race(title_fut_1, title_fut_2).await {
                Either::Left(left) => left,
                Either::Right(right) => right,
            };

        println!("{url} returned first");
        match maybe_title {
            Some(title) => println!("Its page title is: '{title}'"),
            None => println!("Its title could not be parsed."),
        }
    })
}

async fn page_title(url: &str) -> (&str, Option<String>) {
    let text = trpl::get(url).await.text().await;
    let title = Html::parse(&text)
        .select_first("title")
        .map(|title| title.inner_html());
    (url, title)
}
```
Here we start off by calling `page_title` for each of the user-supplied URLs.
We save the resulting futures as `title_fut_1` and `title_fut_2`.
Remember that these don't do anything yet: futures are lazy, and we haven't awaited them.
Next we pass the futures to `trpl::race`, which returns a value to indicate which of the futures passed to it finishes first.
Note that under the hood, `race` is built on a more general function, `select`, which you will encounter more often in real-world Rust code.
A `select` function can do a lot of things that the `trpl::race` function can't, but it also has some additional complexity that we can skip over for now.
Either future can legitimately "win", so it doesn't make sense to return a `Result`.
Instead, `race` returns a `trpl::Either`.
This `Either` type is somewhat similar to a `Result` in that it has two cases.
Unlike `Result`, there is no notion of success or failure baked into `Either`.
Instead, it uses `Left` and `Right` to indicate "one or the other":
```rust
enum Either<A, B> {
    Left(A),
    Right(B),
}
```
The `race` function returns `Left` with that future's output if the first argument wins and `Right` with the second future argument's output if *that* one wins.
This matches the order the arguments appear in when calling the function: the first argument is to the left of the second argument.
We also update `page_title` to return the same URL passed in.
Now if the page that returns first does not have a `<title>` we can resolve, we can still print a meaningful message.
With that information available, we wrap up by updating our `println!` output to indicate both which URL finished first and what, if any, the `<title>` is for the web page at that URL.
Here we have built a small working web scraper.
Now you can pick a couple of URLs and run the command line tool.
Some sites are consistently faster than others, while in other cases the faster site varies from run to run.
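For example (which site wins, and therefore the exact output, will vary from run to run):
```
$ cargo run -- https://www.rust-lang.org https://doc.rust-lang.org
```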
These are the basics of working with futures; now we can dig deeper into what we can do with async.


2127
hello-async/Cargo.lock generated Normal file

File diff suppressed because it is too large

7
hello-async/Cargo.toml Normal file

@ -0,0 +1,7 @@
[package]
name = "hello-async"
version = "0.1.0"
edition = "2024"
[dependencies]
trpl = "0.2.0"

21
hello-async/src/main.rs Normal file

@ -0,0 +1,21 @@
extern crate trpl;

use trpl::Html;

async fn main() {
    let args: Vec<String> = std::env::args().collect();
    let url = &args[1];
    match page_title(url).await {
        Some(title) => println!("The title for {url} was {title}"),
        None => println!("{url} had no title"),
    }
}

async fn page_title(url: &str) -> Option<String> {
    let res = trpl::get(url).await;
    let res_text = res.text().await;
    Html::parse(&res_text)
        .select_first("title")
        .map(|title_element| title_element.inner_html())
}
