janriemer, to rust

Clippy's lints for dealing with async in Rust:

https://rust-lang.github.io/rust-clippy/master/#/async

(Note that this is just a search for the term "async" in the Clippy docs - there is no special "category" involved here.)
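
For example, two of the lints that turn up in that search can be enabled per crate like this (a minimal sketch; the function is a made-up illustration):

```rust
// Crate-root attributes opting in to two async-related Clippy lints
// (both appear in the "async" search linked above).
#![warn(clippy::unused_async)]       // async fns that never await anything
#![warn(clippy::async_yields_async)] // async blocks that merely return another future

/// `clippy::unused_async` flags this: the fn is `async` but never awaits.
pub async fn add(a: u32, b: u32) -> u32 {
    a + b
}
```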

janriemer, to rust

Yay! @notgull is working on smol integration with axum! :awesome:

https://github.com/notgull/smol-axum

The smol runtime:
https://github.com/smol-rs/smol

Tokio is not the end-all-be-all runtime. You should keep looking at alternatives and what unique features they can offer. ✨

Don't be blinded by the "if it is the most popular, it must be the best" fallacy.

janriemer, to rust

This thing will (probably) blow up 🚀

mfio - Framework for Async I/O Systems:

https://github.com/memflow/mfio

"mfio is a one-stop shop for custom async I/O systems. It allows you to go wild, beyond typical OS APIs.[...]"

  • Async
  • Automatic batching (vectoring)
  • Fragmentation
  • Partial success
  • Lack of color (full sync support)
  • I/O directly to the stack
  • Using without standard library

janriemer, to rust

makes some happy screaming noise :awesome: :ferris:

Rust will end this year with a long-awaited feature that will define its future:

We will get "async fn and return-position impl Trait in trait" (AFIT & RPITIT).

It will be stabilized in the next version, 1.75, which will be released on 28 December 2023.

https://releases.rs/docs/1.75.0/

PR:
https://github.com/rust-lang/rust/pull/115822
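
For reference, a minimal sketch of what this stabilization allows (Rust 1.75+); the trait and types are made up for illustration:

```rust
// `async fn` directly in a trait definition - stable as of Rust 1.75.
trait Fetcher {
    async fn fetch(&self, url: &str) -> Result<String, String>;
}

struct MockFetcher;

impl Fetcher for MockFetcher {
    async fn fetch(&self, url: &str) -> Result<String, String> {
        // No real I/O here; just echo the URL back.
        Ok(format!("response from {url}"))
    }
}
```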

Thank you Rust for all of your hard work! ❤️

janriemer,

@matze Yes, that's true.

However, writing the desugared version in the trait is still compatible with using async fn in the trait impl, so the "burden" falls mostly on library maintainers.

Also, this problem only applies when the Self type is generic.

And Send is only required if one decides to use a multithreaded runtime.
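
A minimal sketch of that combination (names made up): the trait spells out the desugared return type with a Send bound, while the impl still uses plain async fn.

```rust
use std::future::Future;

// Desugared form in the trait: the returned future is named explicitly,
// so the trait can require it to be `Send`.
trait Storage {
    fn load(&self, key: &str) -> impl Future<Output = Option<String>> + Send;
}

struct MemoryStorage;

// The impl can still be written with `async fn`; the compiler checks that
// the resulting future satisfies the `Send` bound from the trait signature.
impl Storage for MemoryStorage {
    async fn load(&self, _key: &str) -> Option<String> {
        None
    }
}
```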

I highly recommend the following article by @notgull about smol:

Why you might actually want async in your project

https://notgull.net/why-you-want-async/

notgull, to random

Answering a frequently asked question: how do you do concurrent combinators in smol?

https://notgull.net/futures-concurrency-in-smol/
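
One basic pattern (my own sketch, not code from the post): spawn the futures as tasks on a smol Executor and await the collected handles.

```rust
use smol::Executor;

fn main() {
    let ex = Executor::new();

    smol::block_on(ex.run(async {
        // Spawn several futures as tasks; keep the handles, because
        // dropping a `Task` cancels it.
        let tasks: Vec<smol::Task<u32>> = (0..4)
            .map(|i| ex.spawn(async move { i * 2 }))
            .collect();

        for task in tasks {
            println!("result: {}", task.await);
        }
    }));
}
```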

janriemer,

@notgull This is such a good blog post. I've learned a lot!

Thank you for sharing. ❤️

"The best part is that the allocation, the Vec<smol::Task<()>>, isn’t even necessary. It could be one-time allocation that is just extended to hold the tasks."

Wow, this is mind-blowing to me - I haven't even considered this before! 🤯

Memory-reuse FTW! :awesome:
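
A small sketch of that reuse idea (my own illustration, not code from the post): keep one Vec around and refill it for every batch of tasks, so the allocation is paid for only once.

```rust
use smol::Executor;

fn main() {
    let ex = Executor::new();

    smol::block_on(ex.run(async {
        // One allocation up front, reused for every batch below.
        let mut tasks: Vec<smol::Task<()>> = Vec::with_capacity(8);

        for batch in 0..3 {
            tasks.extend((0..8).map(|i| {
                ex.spawn(async move {
                    println!("batch {batch}, task {i}");
                })
            }));

            // Drain the handles and await each task; the Vec keeps its capacity.
            for task in tasks.drain(..) {
                task.await;
            }
        }
    }));
}
```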

For more visibility =>
