Hacker News | GoodDreams's comments

Not sure if I had the same issues, but my situation was resolved by cleaning my Lightning port and using a quality cable.


I want tested code in my docs. I want my docs’ build to fail if sample code doesn’t actually work.


We built something like this at my previous employer, on top of asciidoc. With integration tests in both directions: documented code was executed and tested for correctness, and the product build would fail if certain features didn't have corresponding documentation.

Automating these things is a pretty easy way of enforcing the availability of documentation. It does nothing to check whether the documentation is halfway readable, however ...


"most" to me means more than half. Many folks think it means somewhere around 2/3 or 3/4.


I wonder what percentage are afraid of non-autonomous cars.


I’m interested in understanding what it means to be a military minded coder.


Slow is smooth. Smooth is fast.


Festina lente


In the field there's no time for false drama or fluff: you have resources (time, energy, devices) and you keep doing the best you can at every step. And if you don't think enough about how you plan your operations, you die.

I don't want my teammates to feel on the verge of death, but I really, really work better if I'm operating at high pace and density and the team does too, like a swarm of people attacking all problems at all levels on the job.


It’s likely a reference to the story in the linked article


Reminds me of: People said you can’t write a financial trading system in a garbage collected language. Turns out you can as long as you don’t garbage collect during the trading day by carefully managing allocations and manually running GC each day before trading starts.

Or the old story of the junior engineer who finds a memory leak in a missile’s guidance system and the senior engineer says the memory leak is fine as long as you don’t run out of memory before the missile completes its mission.
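
The "collect only outside trading hours" pattern can be sketched with Python's gc module (hedged: the systems in these stories were Java/C++, and CPython still frees non-cyclic garbage via reference counting regardless; this only shows the shape of the discipline, not a real trading system).

```python
import gc

def before_market_open():
    gc.collect()   # clean up everything left over from startup
    gc.freeze()    # park survivors so future collections skip rescanning them
    gc.disable()   # no automatic cyclic collection during the session

def after_market_close():
    gc.enable()    # latency no longer matters; resume normal collection
    gc.unfreeze()
    gc.collect()

before_market_open()
assert not gc.isenabled()   # no collector pauses during the trading day
after_market_close()
assert gc.isenabled()
```

The function names are illustrative; the real discipline also means the code between open and close must allocate carefully enough that it never exhausts the heap before the evening collection.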


"junior"? :P

https://devblogs.microsoft.com/oldnewthing/20180228-00/?p=98...

_________

From: k...@rational.com (Kent Mitchell)
Subject: Re: Does memory leak?
Date: 1995/03/31

Norman H. Cohen (nco...@watson.ibm.com) wrote:
: The only programs I know of with deliberate memory leaks are those whose
: executions are short enough, and whose target machines have enough
: virtual memory space, that running out of memory is not a concern.
: (This class of programs includes many student programming exercises and
: some simple applets and utilities; it includes few if any embedded or
: safety-critical programs.)

This sparked an interesting memory for me. I was once working with a customer who was producing on-board software for a missile. In my analysis of the code, I pointed out that they had a number of problems with storage leaks. Imagine my surprise when the customer's chief software engineer said "Of course it leaks". He went on to point out that they had calculated the amount of memory the application would leak in the total possible flight time for the missile and then doubled that number. They added this much additional memory to the hardware to "support" the leaks. Since the missile will explode when it hits its target or at the end of its flight, the ultimate in garbage collection is performed without programmer intervention.

--
Kent Mitchell                 | One possible reason that things aren't
Technical Consultant          | going according to plan is .....
Rational Software Corporation | that there never was a plan!


This is why we need a Software Engineering license.

When the point of the product is to kill someone, you can't just stochastically measure shit like this. If the device is ever out of control before it ceases operations, you're looking at Geneva Convention level offenses.

Even software designed to save lives can't get away with this sort of thing.


Teachers have licenses. It does not make them any good at teaching though.


Well, no, it doesn't, but it does assure the public that all certified teachers have a duty to care for your children in a safe environment, in which teachers are also duty-bound to report anything illegal or potentially harmful, as they are liable in court if something does end up happening.


Doctors have licenses and that doesn't necessarily make them any good at doctoring either, but that doesn't mean you should prefer one without a license.


Bad analogies are bad analogies. A doctor can be a matter of life and death.


>you're looking at Geneva Convention level offenses.

That's a joke. Russia is intentionally bombing Ukrainian civilians, you think they're going to have a single War Crimes charge put against them? How many civilians did the US unintentionally kill across Afghanistan and Iraq, hundreds of thousands? See any charges there either?


Just because charges don't get brought doesn't mean that you didn't commit a crime.

I don't know how I could possibly get one but I'd love to read an independent comparison of this sort of behaviour between US operations and Russian ones. My belief is that the US aims not to kill civilians but is often careless and nets a lot of collateral damage, whereas Russia doesn't care at all and will happily bomb schools if it thinks there's a target in there. But of course I mostly read Western media and writing so my view is potentially very biased.


You've just asserted that it's stochastic with no basis. It could easily be (and in fact is much more likely to be) a periodic operation that leaks memory.

This sort of analysis is completely normal in software. E.g. in safety critical software you often analyse maximum possible stack depth to check that your stack is big enough (one of the reasons why recursive code is sometimes disallowed). This is exactly the same class of analysis.
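
The analysis in the missile story is just rate-times-duration arithmetic. With made-up numbers (leak per control tick, tick rate, and flight time are all hypothetical), the bound looks like this:

```python
LEAK_PER_TICK_BYTES = 256    # hypothetical bytes leaked per control cycle
TICK_RATE_HZ = 100           # hypothetical control loop frequency
MAX_FLIGHT_TIME_S = 600      # hypothetical maximum flight time

worst_case_leak = LEAK_PER_TICK_BYTES * TICK_RATE_HZ * MAX_FLIGHT_TIME_S
provisioned = 2 * worst_case_leak   # the story's factor-of-two margin

print(worst_case_leak)   # 15360000 bytes, about 14.6 MiB
print(provisioned)       # 30720000 bytes
```

If the leak per tick is bounded and the mission time is bounded, the total is a hard number you can buy RAM against, which is exactly why it's deterministic analysis rather than a stochastic measurement.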


Isn't the reason software has flourished in the last 20 years that we have had no regulation? Adding regulation would then lead to a contraction of the software industry as we know it. Maybe the regulation should be not on employees, but on products in categories where they are subject to life/death situations.

If you think getting a license is the way to fix quality issues, I have a bridge to sell you.


I do wonder what happens if the missile runs out of memory

It just drops?


Presumably undefined unless specifically designed to fail safe, but a logical guess is that it keeps burning on its present trajectory until it falls and blows up at the end of that trajectory, if the device uses an impact fuse. I would be very interested in a more precise answer if you can find one.

https://www.scienceabc.com/innovation/why-do-some-missiles-e...


There's a star trek episode about such a thing, https://memory-alpha.fandom.com/wiki/Warhead_(episode)

Apparently it will "hijack" more memory and attempt to reconstruct its orders.

But, real world, I would say that data corruption would occur and, chemistry/physics aside, it would just drop and lock up.

May have a failsafe to detonate, but that is a fun question indeed.


> People said you can’t write a financial trading system in a garbage collected language

People have been writing HFT systems in GC languages (Java, OCaml) for a while, and as you mentioned, you need to know when to pause/unpause/prevent collection in the critical section.

This [1] is an excerpt from a talk by Dave Lauer, who has done it in Java - it took his system ~40 μs from receiving data to producing a market order, and he explains in great detail what amount of work the system had to do. And this was >10 years ago.

[1] https://youtu.be/1ah7XokvcwA?t=785


> you need to know when to pause/unpause/prevent collection in the critical section.

That's his point. It's like saying humans can walk over a lake, as long as it's dry season and the lake is now 10 inches deep.

You can use garbage collected languages as long as you don't collect. (!?!?!)

I'm even more hardline. IMO everything time or life-critical should have static memory allocation, period.
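
That "static allocation only" discipline, approximated in Python (real hard-real-time code would use fixed arrays in C or Ada; this just illustrates the shape): every buffer is sized once at startup, and the hot path only overwrites, never allocates.

```python
import array

# All storage acquired once, before the time-critical phase begins.
SAMPLES = array.array("d", [0.0] * 1024)   # fixed-capacity ring buffer
write_idx = 0

def record(value):
    """Store a sample without allocating: overwrite in ring order."""
    global write_idx
    SAMPLES[write_idx] = value
    write_idx = (write_idx + 1) % len(SAMPLES)

for i in range(3000):        # far more writes than capacity
    record(float(i))

assert len(SAMPLES) == 1024  # the buffer wrapped around; it never grew
```

The point is that peak memory use is known at compile/startup time, so "out of memory mid-flight" is impossible by construction rather than by measurement.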


Sub-10 μs in Java was very doable 10 years ago, fwiw. Even better than collecting once a day is doing everything off-heap — C-style Java :D


Just keep your objects live, attach reusable scratch space to them, and memory pool them instead of throwing them away.

(from my experience anyway in writing critical-path allocation-free C# with Unity).
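
That pattern, sketched in Python rather than C# (class and field names are purely illustrative): objects are preallocated with their scratch space attached, and the hot path recycles them through a free list instead of allocating.

```python
class Order:
    __slots__ = ("price", "qty", "scratch")
    def __init__(self):
        self.price = 0.0
        self.qty = 0
        self.scratch = bytearray(64)   # reusable scratch space, allocated once

class Pool:
    def __init__(self, size):
        self._free = [Order() for _ in range(size)]   # preallocate up front
    def acquire(self):
        return self._free.pop()        # hot path: no allocation, just a list pop
    def release(self, obj):
        self._free.append(obj)         # recycle rather than letting GC reclaim it

pool = Pool(8)
o = pool.acquire()
o.price, o.qty = 101.5, 10
pool.release(o)
assert pool.acquire() is o             # the same object comes back, still warm
```

Since nothing is ever handed back to the allocator on the hot path, the collector has nothing new to trace there.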


ya, also keep everything in cache to be below a few mics


Or to put it in other terms, you want to consume memory at a rate lower than propellant.


Well, for ballistic missiles a large portion of the flight will be the mid-course phase (which is typically unpowered) and the terminal phase may use control surfaces for maneuvering. For cruise missiles that's a good rule of thumb.


Not typically, always. "Ballistic" in "ballistic missile" means "moving under the force of gravity only".


Might you not use some fuel for countermeasures during the midcourse phase?


Yeah, a ballistic missile's bus still does a lot of horizontal maneuvering during the mid-course phase to actually get on target; the main boost phase is mostly just for getting it pointed in the general direction and providing verticality.


I imagine RAM weight is not a relevant parameter on the rocket equation.


Reminds me of a sales call to a military contractor. They were interested in an Open Source OS because of the licensing costs. The embedded OS they were using had a license that said you could run as many instances as you liked on a "single device".

Their problem was that when they operated their device it was indeed a single device, but it later became many more. If they paid for the extra licenses they didn't need, they would be ripping off the government; if they failed to pay for the extra licenses, at some point they would be in breach of their contract with the government when all these unlicensed devices were released.

It rapidly became obvious that their application was a MIRV and that they weren't really going to change but had asked us to talk with them as part of their process of negotiating a deal with their current supplier.

It's hard to imagine really worrying about your license agreements if there actually were a global thermonuclear war.


That reminds me of a short sci-fi story I read about an alien invasion, in which humans were losing horribly until we learned about their legal system and sued them into submission


That is an amazing story.


> Turns out you can as long as you don’t garbage collect during the trading day by carefully managing allocations and manually running GC each day before trading starts.

The surprising thing here is that it is considered worth it with those limitations.

My hope/guess is that because of the GC hype there weren't any good alternatives.

With the alternatives gaining maturity now I'd hope that that decision wouldn't be made today (when starting from a clean slate).


> The surprising thing here is that it is considered worth it with those limitations.

It shouldn't be that surprising. Languages like Java and C# are perfectly sensible for writing the vast majority of most trading systems. So it's eminently reasonable to use them, or an equivalent, and deal with the bits that are GC-phobic as special cases.

I've worked with/on trading systems written in mainstream languages including C++, Java, C# and Python, as well as less mainstream ones such as APL and Smalltalk. I can safely say that there are way more critical concerns than whether or not a system uses a GC.


We have Rust now, and some safe memory usage patterns for C++. GC just increases your energy bill and helps heat up the planet.


> GC just increases your energy bill

How? GC does its work in large batches once in a while, whereas for example Arc (the only serious alternative to what GC brings, which is the ability to have arbitrary graphs of objects) does its work much more frequently, with additional effort expended on atomic operations in a multithreaded environment, and with additional memory writes to boot. Somehow I doubt that GC "just increases your energy bill".


[Citation needed] Rust encourages reference counting, a particularly inefficient form of GC which actually turns reads into writes(!), and it cannot move objects to compact memory except when it can prove that every object has only a single reference.


>We have Rust now, and some safe memory usage patterns for C++.

>GC just increases your energy bill and helps heat up the planet.

Were C++ compilation times that use 100% of my PC taken into consideration?

And not just for the final binary, but also the compilations that happen every day for development purposes too.


I usually hear those sorts of rationalizations in companies that have already become feature factories. We just have to go fast. We can't make people slow down and learn discipline, or if we have, we can't go back and fix the past because we have to go go go.

In retrospect I think I should have put more effort into breaking the feature factory culture than the discipline culture. It's hard to get clean if your drug of choice is ready at hand.


> it is considered worth it with those limitations.

These people make decisions based on Excel spreadsheets that are not properly documented. I wouldn't trust their judgement, despite the fact they managed not to crash the global financial system too frequently.


For many processes where the lifecycle is limited, freeing memory can be a waste of time; most notably, freeing memory just before exiting is a complete waste.


Depending on your OS. And if you want to profile or move code around, it's better to free it asap.


There's a reason why lots of HFT firms have massive amounts of RAM. It's easier to get the code out first, and handle leaks with a reboot later.


And RAM is quite cheap these days compared to the money the hardware is supposed to make.


Is that really an old story? There's usually no memory allocation after initial startup in safety- and mission-critical systems. Rolling over past the max value of an integer data type is a thing, though, and they do reboot for that.


The parent probably meant it more as an urban legend rather than some sort of documented event.


No, it's a real story. See sibling replies.


"The End of Empathy" is an Invisibilia (NPR) episode arguing that empathy for a person harms others. I find the episode to be an interesting window into how some folk think. https://www.npr.org/programs/invisibilia/712280114/the-end-o...


A law professor explaining why you should never say anything to the police: https://youtu.be/d-7o9xYp7eE


Same. I considered switching iCloud accounts when I switch between work/personal modes but decided that the risk of accidentally leaking information between accounts or losing data was too great.

At least with the Macs I can create a separate OS user account and have that account signed into the other iCloud account. Think iPad OS will support multiple users some day?


With Sidecar I was hoping that physically connecting the iPad to the Mac would be enough. Would a physical connection (with perhaps an additional prompt) be a sufficiently secure authorization alternative? Putting my opsec hat on: I suppose someone could (theoretically) use an adapter to tunnel USB/Thunderbolt through a wireless connection to work around the need for a physical connection.


Didn’t think of that, good point!

