> I see how you may think that, but I would argue that I am not redefining anything, as things turn nonsensical without this assumption.
The point I'm making is that this statement reduces to the claim that defining an algorithm's worst-case time complexity as O(1) or constant time is nonsensical.
Do you disagree that adding two numbers is a constant time operation?
> Do you disagree that adding two numbers is a constant time operation?
It is constant time by the cryptographic-programming definition in both theory and practice, but O(1) only in theory (bigints are a pain).
I did not claim that O(1) is nonsensical. Rather, that certain O(1) variable-input algorithms are in fact simply fixed-input algorithms, because they never actually consider their input.
I also find these "nonsensical" O(1) algorithms to be outliers.
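To make that distinction concrete, here is a minimal sketch in Python (the helper names MASK64, add_u64, and add_limbs are made up for illustration): the crypto-style "constant time" property is that a fixed-width addition does the same work regardless of the operand values, whereas once the operands are arbitrary-precision the cost grows with the number of limbs, so it is no longer O(1) in the input size.

```python
from itertools import zip_longest

MASK64 = (1 << 64) - 1

def add_u64(a: int, b: int) -> int:
    """Fixed-width addition: one word operation, independent of the operand values."""
    return (a + b) & MASK64

def add_limbs(a: list[int], b: list[int]) -> list[int]:
    """Arbitrary-precision addition over little-endian 64-bit limbs.

    The loop runs once per limb, so the cost grows with the operand size
    rather than staying O(1).
    """
    out, carry = [], 0
    for x, y in zip_longest(a, b, fillvalue=0):
        s = x + y + carry
        out.append(s & MASK64)   # keep the low 64 bits as this limb
        carry = s >> 64          # propagate the overflow to the next limb
    if carry:
        out.append(carry)
    return out
```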
> Do you disagree that adding two numbers is a constant time operation?
I do. A general algorithm for adding two numbers, with at least one of them represented in the conventional way as a sequence of digits in some base, has a lower bound of Ω(log n) in the magnitude n of the inputs, i.e. it must do work linear in the number of digits.
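A minimal sketch of that point, assuming the numbers arrive as little-endian lists of base-10 digits (the helper name add_digits is made up for illustration): schoolbook addition has to touch every digit, so the work is linear in the digit count, which is Θ(log n) in the magnitude n.

```python
def add_digits(a: list[int], b: list[int], base: int = 10) -> list[int]:
    """Add two little-endian digit lists; runs in time linear in max(len(a), len(b))."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = carry
        if i < len(a):
            s += a[i]
        if i < len(b):
            s += b[i]
        result.append(s % base)  # current output digit
        carry = s // base        # carry into the next position
    if carry:
        result.append(carry)
    return result

# 999 + 27 = 1026, written least-significant digit first.
assert add_digits([9, 9, 9], [7, 2]) == [6, 2, 0, 1]
```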