The demand for memory is going to result in more factories and production. As long as demand is high, there's still money to be made in going wide to the consumer market with thinner margins.
What I predict is that we won't advance in memory technology on the consumer side as quickly. For instance, a huge number of basic consumer use cases would be totally fine on DDR3 for the next decade. Older equipment can still produce it, so it retains value, and we may see new platform designs come out on older fabs.
Chiplets are a huge sign of growth in that direction - you end up with multiple components fabbed on different processes coming together inside one processor. That lets older equipment still have a long life and gives the final SoC assembler the ability to select from a wide range of components.
> he took a PDF of my book Terraform: Up & Running, uploaded it into a GenAI tool, and asked the tool to follow the guidance in the book to generate Terraform code
This is ridiculous - AI doesn't need to be fed a PDF of a Terraform book to know how to Terraform. Blowing out context with hundreds of OCR'd pages of generic text on how to terraform isn't going to help anything.
The model that's really being broken here is "content for hire". That's the industry that is going to be destroyed, because it's simply redundant now. Actual artwork, actual literature, actual music... these things are all safe as long as people actually want to experience the creations of others. Corporate artwork, simple documentation, elevator music... these things are done; I'm sorry if you made a living making them, but you were ultimately performing an artisanal task in a mostly soulless way.
I'm not talking about video game artists, mind you, I'm talking about the people who produced Corporate Memphis and Flat Design here. We'll all be better off if these people find a new calling in life.
You are talking about some video game artists, and while not their fault directly, EA pushing out SportsBall 2026 for the 40th year in a row is just a soulless corporate money printing machine.
I submit that the problem was Solaris 9 just being massively overtaken by desktop Linux in every way - easier to install, faster, simpler to patch, better GUI, more software available. Solaris 10 was too little too late; it already had to compete with mature Linux offerings that had a greater mindshare, and the excellent features like Zones were too far ahead of their time (and lacked the centralized store of downloadable Zone-prepackaged apps that is really the reason for Docker's success).
Purity doesn't matter in practice - especially in a world where OS installations are increasingly ephemeral and ideally immutable.
I always hear DTrace is awesome, but have never used it. And seem to have gotten by just fine... what am I actually missing?
ZFS, on the other hand, was so good that it has outlived Solaris itself and is at the core of e.g. TrueNAS as a commercial product.
DTrace essentially allows two things. The first is exploring a live system, similar to being in a debugger but without its overhead. The other is monitoring: it makes it possible to notice that something is wrong earlier, and with less overhead, than tailing logs and probing services.
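As a sketch of that first use case (assuming a system with DTrace available; `syscall:::entry` is a standard probe description and `count()` a standard aggregating function, though exact probe availability varies by platform):

```
/* Count system calls per process on a live system - a rough
   stand-in for "what is this box doing right now?" without
   attaching a debugger to anything.
   Run with: dtrace -s syscalls.d, then Ctrl-C to print the
   aggregation, sorted by count. */
syscall:::entry
{
    @calls[execname] = count();
}
```

The point isn't this particular one-liner; it's that you can ask ad-hoc questions like this of a production system without restarting or instrumenting anything in advance.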
Neither is strictly necessary, and neither is much used in practice, for various reasons - but depending on what you're doing, it's very handy to have. Similar, in a way, to ZFS vs Stratis: the first is ready to go, well made, and convenient, while the second is an absurd mess made by people who don't understand operations and think they know better from their development machine.
On Solaris... In my opinion, the real problem is less about Slowlaris and more about having chosen for so long to target the élite instead of students, who are the future technicians, managers, etc.
GNU/Linux succeeded IMVHO because it was aimed at that audience, which is much broader - many in it weren't interested, but it's the group that shapes every future generation of decision-makers and technicians. By the time Sun realized this, first with SXDE/CE and then OpenIndiana, it was, yes, damn late.
Yeah, I totally agree with your last point. I learned Solaris 9 in school, but had already been using Debian for years at that point, and it just felt like I was boxed out of having a real learning experience in the ecosystem; everything was locked up behind enterprise paywalls and traditional sales processes.
Could use some large scale geo-engineering. Pity that we don't have a radiation-free way of blowing a gigantic hole into the ground that can store a few trillion litres.
Probably a bad idea, and definitely a "needs to be bid out to responsible parties" question, but would there be a way to safely use even separated "landfill refuse" to build significant parts of the enclosing structure?
That’s almost always going to be cheaper to source from a nearby quarry than from municipal sorting centres when you’re talking multiple millions of cubic metres.
Not great up here in Vancouver either - lots of rain but not snow. The problem with this is that even though we'll have full reservoirs at the start of the summer, when the rain ends, we deplete the lakes rapidly, and that slope downward gets steeper every year. It really makes me think that we'll need more dams, more reservoirs, to hold in more of the precious fresh water rather than letting it all run out. All winter long the rivers have been at really high flow rates because the lakes are full and the dams are wide open letting it go... but we'll miss that water in a few months!
It's an oscillation. It goes in cycles. Things formalize upward until you've reinvented XML, SOAP and WSDLs; then a new younger generation comes in and says "all that stuff is boring and tedious, here's this generation's version of duck typing", followed by another ten years of tacking strong types onto that.
MCP seems to be a new round of the cycle beginning again.
Bigger companies have vulnerability and version management toolsets like Snyk, Cycode, etc. to help keep things up to date at scale across lots of repos.