Are there any tools to convert large LaTeX documents to Typst? It looks like a huge improvement, but the migration path is the only thing that's stopping me.
In places where there is not much time for code refactoring, the following is helpful:
Imagine an idealised future state of the codebase, which everyone buys into, and make sure any new feature is going in that direction.
Refactoring existing code can be death by a thousand cuts; having a parallel new codebase which is incrementally adopted can be more efficient and quicker to market.
This is fairly obscure, but the problem he highlights can be overcome easily by localising at the saturation of S_f: D(f) = D(g) if and only if the saturations are equal, and the natural map from the localisation at S_f to the localisation at its saturation is an isomorphism.
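Concretely (my notation; this is standard commutative algebra, not something taken from the talk):

    \bar{S}_f = \{\, g \in A : g \mid f^n \text{ for some } n \,\}
    D(f) = D(g) \iff \sqrt{(f)} = \sqrt{(g)} \iff \bar{S}_f = \bar{S}_g
    A_f = S_f^{-1}A \;\xrightarrow{\sim}\; \bar{S}_f^{-1}A

So \bar{S}_f^{-1}A depends only on the open set D(f) and not on the chosen f, which dissolves the multi-valuedness mentioned below.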
It seems though that we would want the computer to be able to do this kind of reasoning itself, and not rely on humans "pre-resolving" all such problems.
I understood the presentation to be on the distinction between equality and isomorphism and when an isomorphism "is an equality". From the slide on homotopy type theory I get the impression that he finds it unsatisfactory to simply consider all isomorphisms as equalities.
So many times it’s necessary to ‘identify’ two or more objects which are isomorphic, and the word ‘canonical’ is supposed to justify why this doesn’t cause a problem.
The reason it is necessary here is that the mapping D(f) -> A_f is a priori multi-valued, and we need it to be single-valued.
However, it’s fairly easy to make a definition which is trivially single-valued, so I don’t think it’s a very instructive example of the phenomenon.
Probably more pertinent is an n-fold tensor product with different bracket orderings.
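For instance (my illustration, not an example from the talk), already with three factors the two bracketings are canonically isomorphic but not equal:

    (M_1 \otimes M_2) \otimes M_3 \;\cong\; M_1 \otimes (M_2 \otimes M_3),
    \qquad (m_1 \otimes m_2) \otimes m_3 \;\mapsto\; m_1 \otimes (m_2 \otimes m_3)

For n factors there are Catalan-many bracketings, and it is Mac Lane's coherence theorem that justifies speaking of 'the' n-fold tensor product.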
Stacks, like EGA before it, is a wonderful reference but a terrible textbook. I think even the authors would agree with this! Fortunately there are many other books from which to learn algebraic geometry, after which Stacks will start to make a lot more sense.
Modern algebraic geometry is indeed highly abstract, but generally the conjuring of obscure objects is done with a specific goal in mind, for example:
- consolidation of many types of results into a 'simple' theoretical framework; I suppose this originates with Noether and reaches its apotheosis in Bourbaki's tracts.
- embedding of 'classical' objects (solutions to polynomial equations) inside a larger 'category' (schemes) where certain mysterious relations observed in the classical world (the Weil conjectures) have a more 'natural' interpretation (a fixed point theorem) and light the way to a proof which would otherwise have been beyond reach.
I’m generally positive on rewriting OpenSSL in Rust, but I agree the comparisons aren’t completely scientific, or necessarily more important than correctness. First, you should compare performance using the same implementations of the cryptographic primitives, as these are relatively fungible; we also don’t know whether, for example, OpenSSL’s optimised primitives were used. Secondly, the Rust language only excludes memory bugs; it doesn’t exclude errors in the implementation of the TLS protocol or incorrect usage of cryptographic primitives, which can be just as catastrophic for security. These have been prevalent in OpenSSL and are somewhat harder to prevent a priori. For all we know these issues are worse in rustls than in OpenSSL. This is where formally verified implementations would be useful.
> Secondly, the Rust language only excludes memory bugs; it doesn’t exclude errors in the implementation of the TLS protocol or incorrect usage of cryptographic primitives, which can be just as catastrophic for security.
I linked to a talk about the project elsewhere, and it's worth noting that the author of rustls leverages a lot of Rust techniques that ensure certain correctness attributes at a semantic level, not just memory safety.
In particular, TLS libraries have long suffered from dealing with the complex composite state machines required by the protocol[0]. Rust makes the expression of safe state machines pretty easy (the talk demonstrates how).
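As a rough illustration of the typestate idea (a toy sketch with invented names, not rustls's actual types): each protocol state is a distinct type, and each transition consumes the previous state, so sending messages in the wrong order fails to compile.

    // Toy handshake: states as distinct types, transitions consume `self`.
    struct AwaitingHello;
    struct AwaitingFinished { cipher: String }
    struct Established { cipher: String }

    impl AwaitingHello {
        fn receive_hello(self, cipher: &str) -> AwaitingFinished {
            AwaitingFinished { cipher: cipher.to_string() }
        }
    }

    impl AwaitingFinished {
        fn receive_finished(self) -> Established {
            Established { cipher: self.cipher }
        }
    }

    fn main() {
        let conn = AwaitingHello;
        let conn = conn.receive_hello("TLS_AES_128_GCM_SHA256");
        let conn = conn.receive_finished();
        // Calling receive_hello again would not compile: `Established`
        // has no such method, and the earlier states have been moved.
        println!("established with {}", conn.cipher);
    }

The compiler, rather than a runtime check, rules out the invalid transitions.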
One issue with OOP in practice that I've seen is the entanglement of the domain representation (the member variables of a class) and the varied operations on that data (the methods). Classic OOP encourages you to manipulate objects through methods rather than free functions, which combines potentially unrelated functionality in the same object.
My rule of thumb with objects is to keep the methods to a minimum, to the extent that all classes are either interfaces, implementations of interfaces, or pure data classes. Obviously this approach will be natural to ML programmers.
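In Rust terms (my sketch of that rule of thumb, with invented names), traits play the role of interfaces, plain structs hold the data, and everything else is a free function:

    // The three kinds of "class" under this rule:
    // an interface, an implementation of it, and a pure data type.
    trait PriceSource {
        fn price(&self, symbol: &str) -> f64;
    }

    struct FixedPrices; // implementation of the interface

    impl PriceSource for FixedPrices {
        fn price(&self, _symbol: &str) -> f64 { 100.0 }
    }

    #[derive(Debug, Clone)]
    struct Order { // pure data, no behaviour
        symbol: String,
        quantity: u32,
    }

    // Potentially unrelated operations live in free functions, not methods.
    fn order_value(order: &Order, source: &dyn PriceSource) -> f64 {
        source.price(&order.symbol) * order.quantity as f64
    }

    fn main() {
        let order = Order { symbol: "ACME".into(), quantity: 3 };
        println!("{}", order_value(&order, &FixedPrices));
    }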
The difficulty with learning 'modern' algebraic geometry is not only that it is very dense and general, but that this means the original motivation can become lost.
So I think understanding the Weil conjectures is key for modern algebraic geometry. And it's always easier to start with algebraic curves (algebraic geometry in dimension 1) and their connection to Riemann surfaces (algebraic curves over the complex numbers with analytic rather than algebraic structure), as they provide motivation for many of the results and constructions.
A good introduction to algebraic curves and the Weil conjectures that I've found is the following:
This is by far the most what-the-fuck language I've seen in a long while. It displays an astonishing amount of creativity and novel (at least to me) ideas, most of which are probably bad and won't survive for long, but some of them are really cool! Things like "packed loops"
Fortran does this (still does in F2008). But usually I end up using subroutines instead, since I want the code in question to modify more than one object. Functions can only return one object, and the overhead of packing/unpacking to a struct is annoying.
What Fortran really needs is first-class string handling and saner IO. Apart from those pain points, modern Fortran is a nice language to use.
I wonder if the lack of strongly statically typed embeddable languages is due to an accident or a fundamental limitation of implementations of such languages. Is the type checker code too big? Is it too complex?
Moreover, designing the actual type system is at least as hard as implementing the type checker. Coming up with a static type system that is flexible/ergonomic, expressive/powerful, sound, easy to use, and also a good match for mutable state and imperative code is really, really difficult, as demonstrated by C, C++, and Java.
A flexible/ergonomic dynamic type system is trivial in comparison. Soundness is handled at runtime when the exact value of each expression is known, and ease of use is almost a given.
It's actually extremely easy, particularly if you're familiar with Prolog (the core algorithm is a couple of lines in Prolog, so knowing how that works is useful for implementing it in other languages).
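To give a feel for the core (my sketch: first-order unification over type terms, which is the part Prolog gives you for free):

    // Minimal unification over type terms, the heart of Hindley-Milner
    // style inference. Illustrative only; the occurs check is omitted.
    use std::collections::HashMap;

    #[derive(Clone, Debug, PartialEq)]
    enum Ty {
        Var(u32),              // type variable
        Con(String, Vec<Ty>),  // constructor, e.g. Int or Fun(a, b)
    }

    // Follow substitution links until an unbound variable or a constructor.
    fn resolve(t: &Ty, subst: &HashMap<u32, Ty>) -> Ty {
        match t {
            Ty::Var(v) => match subst.get(v) {
                Some(bound) => resolve(bound, subst),
                None => t.clone(),
            },
            _ => t.clone(),
        }
    }

    fn unify(a: &Ty, b: &Ty, subst: &mut HashMap<u32, Ty>) -> bool {
        let (a, b) = (resolve(a, subst), resolve(b, subst));
        match (a, b) {
            (Ty::Var(v), t) | (t, Ty::Var(v)) => {
                if t != Ty::Var(v) {
                    subst.insert(v, t); // bind the variable
                }
                true
            }
            (Ty::Con(f, xs), Ty::Con(g, ys)) => {
                f == g && xs.len() == ys.len()
                    && xs.iter().zip(&ys).all(|(x, y)| unify(x, y, subst))
            }
        }
    }

    fn main() {
        // Unifying ?0 -> ?0 with Int -> ?1 should force ?1 = Int.
        let mut s = HashMap::new();
        let id = Ty::Con("Fun".into(), vec![Ty::Var(0), Ty::Var(0)]);
        let app = Ty::Con("Fun".into(),
                          vec![Ty::Con("Int".into(), vec![]), Ty::Var(1)]);
        assert!(unify(&id, &app, &mut s));
        println!("?1 = {:?}", resolve(&Ty::Var(1), &s)); // Con("Int", [])
    }

Full inference then just walks the syntax tree, inventing fresh variables and calling unify; that walk is where a host-language implementation picks up most of its length relative to Prolog.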
I suspect the lack of statically typed scripting languages is more of a historical accident than anything.
Couldn't agree more; see e.g. similar restrictions in those well-known bastions of socialism, Singapore, Denmark, and Australia. Non-resident investment in Manhattan is literally rent-seeking. Perhaps if the foreign investment were redirected to regenerating post-industrial cities, or even to actual job creation, society as a whole would be better off.