
Agreeing with you by example: in computational complexity, running time is analyzed as a function of input size, which varies. In cryptography it typically isn't: "constant time" means the time is identical for all inputs of the same size.

Imagine a magic sorting algorithm that always takes 1 second to check whether an input is sorted, and 9 more seconds to sort the input if it wasn't sorted. It then returns the sorted value as output.

E.g. sorting "abcdefg" would check (taking 1 second) and then return (taking 0 additional seconds, since it's already sorted). Sorting "gfedcba" would check (taking 1 second), then sort (taking 9 seconds), and return. Given the complete works of Shakespeare, it would check (taking 1 second), then sort (taking 9 seconds), and return.

It's O(1), yet the time varies based on the input. From a computational-complexity standpoint it's constant time; from a cryptographic (and common-intuition) standpoint it clearly isn't, since it doesn't always take the same time to run.
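The magic sorter can be sketched as follows. This is a minimal illustration, not a real algorithm: the 1-second check and 9-second sort are simulated with a cost counter rather than real delays, and the function name is invented for this example.

```python
def magic_sort(s: str) -> tuple[str, int]:
    """Hypothetical sorter: 1 'second' to check sortedness,
    9 more 'seconds' to sort if the input wasn't sorted.
    Returns (sorted_string, simulated_seconds_elapsed)."""
    cost = 1  # the check always costs 1 second, regardless of input size
    if list(s) != sorted(s):
        cost += 9  # sorting an unsorted input costs 9 more seconds
    return "".join(sorted(s)), cost


# Already sorted: only the 1-second check runs.
print(magic_sort("abcdefg"))  # ('abcdefg', 1)
# Unsorted: check plus sort, 10 seconds total.
print(magic_sort("gfedcba"))  # ('abcdefg', 10)
```

The cost is bounded by a constant (10) for every input, so it's O(1); but because the cost depends on the input's contents, a cryptographer would not call it constant time.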


