I think that what you are calling "immutable pass by reference" is what the OP is calling "pass by value". See, when used abstractly, "pass by value" means that the argument is passed as a value, hence it is immutable and the callee can't mutate it. One way to implement this is by copying the data that represents the value. In the OP's language, and in many other languages that work this way, instead of copying the data, we implement "pass by value" by incrementing the reference count and passing a pointer to the original data. These differing implementations provide the same abstract semantics, but differ in performance.
I am unable to extract any meaning from your post. You appear to be making a general claim: it is impossible to design a programming language where everything is a value. You at least admit that "data thingies" can be values. Are you claiming that it is not possible for functions to be values? (If we assume that the arguments and results of function calls are values, then this would mean higher-order functions are impossible, for example.) If not that, then what? Please give a specific example of something that can never be a value in any programming language that I care to design.
I think the parent means it from a lambda calculus perspective. If you only have values at the AST level, then you only have a tree of values, like an XML document.
You can apply meaning to a particular shape of that tree so it can be executed, but then you have basically just added another layer, on top of parsing, that turns the AST into something executable.
This is Trump's MAGA diet, a replacement for the lame liberal DEI diet of the Biden administration. Not hyperbole: the web site states all this explicitly if you click through to this link: <https://cdn.realfood.gov/Scientific%20Report.pdf>
The Scientific Report mentions Trump 4 times, so I looked up Trump's diet. Seems he eats a lot of McDonald's takeout and drinks a lot of Diet Coke. It seems to me that Trump's diet is an exemplary and healthy one that follows these new recommendations, which prioritize foods such as beef, oils, and animal fat (including full-fat dairy), and potatoes. Cheeseburgers and fries fit the bill, and the Diet Coke avoids added sugar while promoting hydration. Trump might be prickly about past criticism of his diet; now he can point to these recommendations.
I did study Prolog in a past life but it never really stuck. It was vibe coded but I spent a lot of time planning prompts - I've had to deal with Claude's style (cruft explosion) in other projects, so I had my eyes open on this one.
No, they are talking about high-performance desktops, mostly. They link to the Framework Desktop, which has 256 GB/s of memory bandwidth. For comparison, the Apple Mac Pro has 800 GB/s of memory bandwidth. Neither manufacturer is able to achieve these speeds using socketed memory.
> no, they are talking about high performance desktops
Then I don't really get the "world has moved on" claim. In my bubble, socketed RAM is still the way to go, be it for gaming or graphics work. Of course Apple users will use a Mac Pro, but saying that the world has moved on when it's about high-performance, deluxe edge cases is a bit hyperbolic.
but maybe my POV is very outdated or whatever, not sure.
I agree and I do not agree. I still sometimes use a ThinkPad X230, and even a G4 PowerBook, and they are fine machines for many tasks. Yet even those have soldered CPUs, simply because of design constraints.
You don't need to train models. Say you want to play a game like Factorio, which, of all things, is bottlenecked on memory bandwidth: you must update each entity in a huge world on every tick, at 60 UPS, and yes, the game is insanely well optimized (check the dev blog). You don't have to play Factorio, but then you also technically don't need DMA.
I think, but am not totally positive, that this is primarily a concern for local LLM hardware. There are probably other niches, but I don't think it's something most people need or would noticeably benefit from.
> I’d argue that actual closures which are unified everywhere as a single procedure type with non-capturing procedure values require some form of automatic memory management. That does not necessarily mean garbage collection nor ARC, but it could be something akin to RAII. This is all still automatic and against the philosophy of Odin.
C++ doesn't have this feature either. A C++ closure does not have the same type as a regular C-style function with the same argument types and result type. The types of functions and closures are not unified.
And C++ does have RAII, which the author feels is a kind of automatic memory management and against the philosophy of Odin.
So C++ doesn't have the feature G.B. says is impossible. I don't know enough to comment on Ada.
What Bill wrote, on his own web site, about his own language is simply this:
> For closures to work correctly would require a form of automatic memory management which will never be implemented into Odin.
I suppose you can insist Bill thinks "correctly" means all that verbiage about unified types, but then a reasonable question would be: why doesn't Odin provide these "not correct" closures people enjoy in other languages?
RAII is entirely irrelevant; the disposal of a closure over a Goose is the same as the disposal of a Goose value itself. In practice I expect a language like Odin would prefer to close over references, but again, Odin is able to dispose of references, so what's the problem?
Ugh, it's like they went out of their way not to use the word Zamboni. Like when people say inline skates instead of Rollerblades, or facial tissue instead of Kleenex. Yes, I know, generic term versus a specific company, blah blah blah. But I feel like the UK has it right when they say Hoovering instead of vacuuming. Same thing here.
This is to avoid trademark dilution, which in the USA can invalidate a trademark. Aspirin, for instance, used to be a trademark of Bayer, but these days is generic.
I did not regularly hear the term "game console" until the late 90s. I used to think the promotion of this term was done by Nintendo in a trademark-protective maneuver to avoid rival systems being called "Nintendos" by granny, but it seems I was mistaken. Nevertheless, in the 80s we called them systems. Which system do you have, Nintendo or Sega?
Recently in my retrogaming media habit I've heard "console" used occasionally to describe video game consoles in advertisements dating back to the early 80s, but at that time it was also used by Texas Instruments to refer to the TI-99/4A computer. TI was naming all of their home products to give a space-age technical feel to them. They marketed joysticks as "Wired Remote Controllers", and cartridges for the TI-99/4A as "Command Modules" or "Solid State Software". So I don't think "console" referring to a gaming device specifically was a term of art back then.
Aspirin is kind of a special case: while it had become generic in the usual way, the actual loss of trademark status was part of the Treaty of Versailles as a punishment for World War I. (So while there are various trademark-protection strategies, "don't lose a world war" might be difficult to pull off :-)