Hacker News | tarlinian's comments

Isn't the real reason to use char8_t over char that char8_t* is subject to the same strict-aliasing rules as all other non-char primitive types? (I.e., the compiler doesn't have to worry that a char8_t* could point to any random piece of memory, as it must for a char*.)


At least in Chromium that wouldn't help us, because we disable strict aliasing (and have to: there are at least a few core places where we violate it, and porting to an alternative looks challenging. Some of our core string-handling APIs presume that wchar_t* and char16_t* are actually interconvertible on Windows, for example; fixing that would mean memcpying, which rules out certain API shapes and adds a perf cost to the rest).


The main effect of this is that some of the conversions between char and char8_t are inefficient.


Elpida was purchased by Micron after the financial crisis (they bought it out of bankruptcy for a song in 2013). Much of the Micron DRAM you might buy is made at the former Elpida fab in Hiroshima.


They're also wrong in suggesting that J-1 visas are intended to just be used for "cultural exchange". They are quite explicitly intended for use by trainees[1]. I assume these folks are intended to be trained in New Mexico so they can ramp the advanced packaging fab in Malaysia when they need to.

[1] https://j1visa.state.gov/programs/trainee


Thank you for the link to Robin Sloan’s blog and temporarily answering my question about whether I should upgrade iOS. I love his books and for some reason never thought to look to see if he wrote anything else online.


I'd encourage you to get his email newsletter as well! He puts out really interesting and thoughtful content in my experience.


This just doesn't make technical sense. I completely agree that backdooring encryption standards is a bad thing. But Dual EC DRBG is a clear example of a NOBUS backdoor actually being that. The backdoor is equivalent to "knowing" a private key. The weakness is not some sort of computational reduction. Using this logic, you would say that no encryption method is possibly secure because you can't rely on its security once the key is exposed.


The reason it remained a "NOBUS" backdoor is because the whole world noticed something was funky with it pretty much immediately (even prior to standardization), and security researchers loudly told people not to use it. At that point the value of cracking open the backdoor is reduced significantly. It was standardized, and barely used except where mandated by governments, for less than 10 years.

There's no reason to think it would have remained a "NOBUS" backdoor forever. Especially if it was more widely used (i.e. higher value), and/or used for longer.

>Using this logic, you would say that no encryption method is possibly secure

I mean, to the extent that a little waterboarding will beat any encryption method, yes, I would say that.

But, for 99.99% of people, your data isn't worth the waterboarding. On the flipside, a backdoor to, say, all TLS communication, would be very worth waterboarding people.


If the refund isn't paid within 45 days of the tax filing deadline (or the day you file, if later), they will pay you interest. See here: https://www.irs.gov/payments/interest#pay


Plenty of actual research costs count as overhead, to avoid the need to hire an army of accountants to allocate every single bit of spending.

For example, the electricity costs of the lab in which the research is run would typically be paid for by the university and would be considered overhead. It's not "administrative bloat". Most of the particularly gross administrative bloat is on the undergraduate side of things where higher tuition costs have paid for more "activities".


Note that the institution I used as an example doesn't even have undergrads. It is not using NIH grants to cross-subsidize a college. Medical research is the only thing they do. And they are the #2 recipient of grants, after Johns Hopkins.


Caltech has undergrads.


Well, how do you know that if you aren't accounting for it?


F&A rates (facilities and administration, "indirects") are subject to negotiation every 4 (IIRC) years, and those costs are accounted for there (perhaps not well enough, but that is a separate point). The administrative component of F&A has been capped at 26% for years, and R1 universities are maxed out, so the negotiations are over the facilities component.


You can know what the research organization costs as a whole, and you can know what's "worth" charging to individual projects. The rest is indirect costs, which you can measure and use when negotiating indirect cost reimbursement with the NSF or NIH.


I think the "X" looks like a pair of scissors. No good explanation for ZUndo though.


I like how you edited this to add "at work" after folks provided examples of it happening outside of work. If you'd like a slightly more work-adjacent example, see the rampant increase in IEPs for students in the Bay Area. (I'm sure the increased time for tests provided in many of these cases is not being abused at all...)


Contextually, it was always about being at work.

Throughout the entire conversation I focused on work accommodations. I suppose I thought that was clear.

The bar for getting an accommodation at work has been higher than in many other places (like bringing an emotional support animal into a store).

I can't speak to schools; that is another very delicate social dynamic, with different incentives for handling these things than a place of business has.


How did you choose the process noise covariance in your `grad` function? It doesn't seem like a single process noise covariance structure should be globally applicable across all possible functions.

