
> "Technically" O(n) is the only O(n).

In idealized algorithmic analysis, but not necessarily in real life. "Amortized O(1)," which I assume you concede is a commonly-used, meaningful and legitimate term, means an operation that is "technically" worse than O(1) in the worst case but O(1) in practice.
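The textbook illustration of amortized O(1) is the growable array: each doubling reallocation is O(n), but across n pushes the total element-copy work stays under 2n, so the per-push cost averages out to a constant. A minimal sketch (the `Vec` type, doubling policy, and `copies` counter are illustrative, not any particular runtime's):

```c
#include <stdlib.h>
#include <string.h>

/* Illustrative growable array. push is amortized O(1): the occasional
 * O(n) reallocation is paid for by the n cheap pushes before it. */
typedef struct {
    int *data;
    size_t len, cap;
    long copies;  /* total elements copied by reallocations */
} Vec;

void vec_push(Vec *v, int x) {
    if (v->len == v->cap) {
        size_t ncap = v->cap ? v->cap * 2 : 1;
        int *nd = malloc(ncap * sizeof *nd);
        memcpy(nd, v->data, v->len * sizeof *nd); /* the O(n) step */
        v->copies += (long)v->len;
        free(v->data);
        v->data = nd;
        v->cap = ncap;
    }
    v->data[v->len++] = x;  /* the O(1) step */
}
```

After 65536 pushes, `copies` is 0 + 1 + 2 + 4 + ... + 32768 = 65535: less than 2n total, hence O(1) per push on average.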

Calling memcpy inside a Ruby method call is amortized O(1) because for any "n" that fits within available memory, it will always be much faster than the other things in a Ruby method call, which involve dozens of locks, hash table lookups with string keys, dynamic type checks, additional Ruby method calls and so forth.

Likewise, computational complexity on an idealized von Neumann machine doesn't always match a real computer, and the mismatch goes in both directions. Dynamic allocations are theoretically O(n) but may be O(1) if the program never exceeds the preallocated space. Or consider a loop over an array of pointers that dereferences each one: the dereferences are theoretically O(1) but may behave like O(n) if they evict the parent array from the cache.
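The preallocation point can be sketched with a bump allocator: as long as requests fit in the preallocated arena, every allocation is constant-time pointer arithmetic, with no search, copy, or system call (the `Arena` type and `arena_alloc` are hypothetical names, not a real library's API):

```c
#include <stddef.h>

/* Hypothetical bump allocator over a preallocated arena. Each
 * successful allocation is O(1): a bounds check plus an add. */
typedef struct {
    unsigned char *base;
    size_t used, cap;
} Arena;

void *arena_alloc(Arena *a, size_t n) {
    size_t aligned = (n + 15) & ~(size_t)15;  /* keep 16-byte alignment */
    if (a->used + aligned > a->cap)
        return NULL;  /* arena exhausted; a fallback would cost more */
    void *p = a->base + a->used;
    a->used += aligned;
    return p;
}
```

Once the arena runs out, the program is back to whatever the fallback costs, which is the "in both directions" caveat above.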

> What is the common case in your view?

Such as an array small enough that it can be copied with 10 or fewer vector load/stores.
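For a sense of scale, assuming 32-byte vector registers (AVX2 width; this is an assumption about the target machine, not something stated above), ten load/store pairs move at most 320 bytes, i.e. an array of up to 80 four-byte ints:

```c
/* Assumption: 32-byte vector registers (e.g. AVX2). Ten vector
 * load/store pairs then cover arrays of up to 320 bytes. */
enum { VEC_BYTES = 32, PAIRS = 10 };

int max_ints_covered(void) {
    return PAIRS * VEC_BYTES / (int)sizeof(int);  /* 80 with 4-byte int */
}
```

With 64-byte AVX-512 registers the same budget would cover 160 ints, so "small" here is machine-dependent.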

> O(3n) = O(2n) = O(n)

Yes, that's my point. It's impossible to implement the example in less than idealized O(n) time, so O(n) and O(1) operations are equivalent complexity-wise WRT the entire method.



> In idealized algorithmic analysis, but not necessarily real life.

Big O notation is used for idealized algorithmic analysis. If you want to talk about real life, you can count cycles, seconds, watts etc.

> "Amortized O(1)," which I assume you concede is a commonly-used, meaningful and legitimate term, means "technically" an idealized O(>1) but O(1) in practice.

Yes, but I wouldn't take O(1) on its own to imply amortized complexity. And pretending that an array copy is O(1) isn't particularly useful here: if you measure a copy operation in practice, you'll find that the time it takes scales roughly linearly with the size of the array. Not to mention that the space complexity is O(n) no matter how you put it.
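The linear scaling is easy to see empirically. A rough sketch using `clock()` (a serious benchmark would pin CPU frequency, warm the caches, and use a monotonic high-resolution timer):

```c
#include <stdlib.h>
#include <string.h>
#include <time.h>

/* Rough sketch: time `reps` memcpy calls of n bytes, return seconds.
 * For large enough n, doubling n roughly doubles the result. */
double time_copy(size_t n, int reps) {
    unsigned char *src = malloc(n), *dst = malloc(n);
    if (!src || !dst) { free(src); free(dst); return -1.0; }
    memset(src, 0xAB, n);
    clock_t t0 = clock();
    for (int i = 0; i < reps; i++)
        memcpy(dst, src, n);
    double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
    volatile unsigned char sink = dst[n - 1];  /* defeat dead-code elimination */
    (void)sink;
    free(src);
    free(dst);
    return secs;
}
```

Comparing `time_copy(1 << 20, 100)` against `time_copy(1 << 24, 100)` should show the roughly 16x spread you'd expect from O(n), at least once n exceeds the cache sizes.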

> Such as an array small enough that it can be copied with 10 or fewer vector load/stores.

Are other cases conversely "uncommon"? My point is that this is entirely your opinion and has no bearing on whether an array copy is O(1) or O(n).

> Yes, that's my point. It's impossible to implement the example in less than idealized O(n) time, so O(n) and O(1) operations are equivalent complexity-wise WRT the entire method.

Not in terms of space.



