
A beach sorts itself by grain size just by applying physics, so an array copied by parallel processes will sort itself given enough time, with computation time spent per element playing the role of the size of the rocks.


I think if you 'just apply physics' (let's say 'just apply computation', shall we?), then an array of numbers can only hope for this to happen in a kind of bogosort-style way: shuffle them all, and if they're now sorted, return; else loop and shuffle again, and so on. Such a hilarious sort algorithm... but wait until you hear about bogobogosort!
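
A minimal sketch of that shuffle-and-check loop, in plain Python just for illustration:

    import random

    def is_sorted(xs):
        return all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))

    def bogosort(xs):
        # Keep shuffling until the list happens to come out sorted.
        # Expected running time is O(n * n!), which is the joke.
        while not is_sorted(xs):
            random.shuffle(xs)
        return xs

    print(bogosort([3, 1, 4, 1, 5, 9, 2, 6]))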

But different sizes of sand do move against each other differently, and I think that aspect is slightly reminiscent of the every-cell-for-itself behaviour of the cells described in the paper, especially how the different rules allow different swap operations depending on whether the swap target is smaller or larger than the current cell. So I think it's a very relevant observation!
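
For illustration, here is a rough sketch of a purely local neighbour-swap rule (essentially odd-even transposition sort, not the paper's actual algorithm): each position only compares itself with one neighbour and swaps if they're out of order, and the comparisons within a phase are independent, so they could run in parallel.

    def neighbor_swap_sort(xs):
        xs = list(xs)
        n = len(xs)
        for phase in range(n):
            # Alternate between even and odd pairs so no two swaps in a
            # phase touch the same element -- each pair acts on its own.
            start = phase % 2
            for i in range(start, n - 1, 2):
                if xs[i] > xs[i + 1]:
                    xs[i], xs[i + 1] = xs[i + 1], xs[i]
        return xs

    print(neighbor_swap_sort([5, 2, 9, 1, 7, 3]))  # [1, 2, 3, 5, 7, 9]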


It's not. Parallelism in computation has physics of its own, in the sense of cycles spent per object, and you can sort reliably that way. You can even "sieve" large objects out by placing size-based obstacles in the array being processed.

It even has a "lightspeed" aka computational object size in cycles * nr of parallel processors. So if you have empty parts of your array, you can insert elements at traveldistance there without violating causality



