IP-Asia met every week via Zoom. Several other people whose names appear in the same literature frequented it too. Pop in tonight for the final session?
Thanks for that link. I attended that at 4am California time. So sorry I never got to talk to him.
It was like attending a funeral for someone I never was able to track down. Feel kinda terrible about all of it. Really sucks, sounds like he was a very friendly guy.
It's desirable to be able to express the idea that we want to continually drive one asynchronous operation to completion while periodically checking whether some other thing has happened, taking action based on that, and then continuing to drive the ongoing operation forward.
This idea may be desirable, but a deadlock is possible if there's a dependency between the two operations. The crux is the "and then continue," which I'm taking to mean that the first operation is meant to pause whilst the second operation occurs. The use of `&mut` in the code specifically enables that, too.
If it's OK for the first operation to run concurrently with the other thing, then wrt. Tokio's APIs, have you seen LocalSet[1]? Specifically:
    let local = LocalSet::new();
    local.spawn_local(async move {
        sleep(Duration::from_millis(500)).await;
        do_async_thing("op2", lock.clone()).await;
    });
    local.run_until(&mut future1).await;
This code expresses your idea in a concurrent setting, which resolves the deadlock. However, `op2` will still never acquire the lock, because `op1` is first in the queue. I strongly suspect that isn't the intended behaviour, but it's also what would have happened if the `select!` code had worked as imagined.
That's true, but it was little more than a portable rendering API, which was of course very useful for the PS2, but probably less interesting for the PC ports. So if you want to count that, you're right, it's not totally from scratch. But it wasn't built on an actual game engine.
AsyncLocalStorage.enterWith is the wrong method; `.enterWith` changes the logger for the remainder of the *synchronous* execution. This doesn't matter if there's only one request happening at a time -- like when you're testing locally. But that's why it didn't work on the actual project.
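To make the `.enterWith` vs `.run` distinction concrete, here's a minimal sketch on plain Node (the names `als`, `log`, and `handleRequest` are hypothetical, not from the project under discussion). `AsyncLocalStorage.run` scopes the store to one callback and the async work it awaits, which is what you want when requests overlap:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Hypothetical per-request store; `requestId` stands in for whatever the
// real logger attaches to each request.
const als = new AsyncLocalStorage<{ requestId: string }>();

function log(msg: string): string {
  // Reads whichever request context is active at the call site.
  const store = als.getStore();
  return `[${store?.requestId ?? "no-id"}] ${msg}`;
}

async function handleRequest(requestId: string): Promise<string> {
  // `run` scopes the store to this callback and the async work it awaits,
  // so overlapping requests each keep their own id.
  return als.run({ requestId }, async () => {
    await new Promise<void>((resolve) => setTimeout(resolve, Math.random() * 10));
    return log("done");
  });
}

// Two overlapping requests; with `run` the ids never bleed into each other.
Promise.all([handleRequest("a"), handleRequest("b")]).then(console.log);
```

The parent's warning is about replacing that `run` call with `als.enterWith({ requestId })`: `enterWith` mutates the store for the remainder of the current synchronous execution, so whether each request keeps its own id then depends on exactly which synchronous context the call happens in.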
I have enterWith working perfectly with async/sync flow - all operations between request and response are logged with the request id transparently. I am using a custom server for both the standalone build & the dev server: Next.js 14, Node 22.
Humans only retrieve information in a library in that way due to past limitations on retrieval and processing. Inventions like the table of contents, or even the Dewey Decimal Classification, were strongly constrained by fundamental technologies like ... the alphabet! And remember, not all languages are alphabetic. And embeddings aren't alphabetic and don't share the same constraints.
I recommend Judith Flanders' "A Place for Everything" as both a history and a survey of the constraints of sorting and organising information in an alphabetic language. It's also a fun read!
tl;dr why would we want an LLM to do something as inefficiently as a human?
"why would we want an LLM to do something as inefficiently as a human?" -- That is a good point. Maybe we should rename artificial intelligence (AI) to super-artificial intelligence (SAI).
"Making the decision to have a child - it is momentous. It is to decide forever to have your heart go walking around outside your body." -- Elizabeth Stone
AIUI early transistor research was funded by public (defense) grants. And Bell Labs was definitely government supported, if not outright government funded.