Candid question: isn't the growth rate of functions the derivative of the Big O complexity instead?
O(1) does not mean that the growth rate is constant. It means that the number of operations is constant; since the derivative of a constant is zero, the growth is nil.
O(1) means the number of operations is bounded, not necessarily constant. f = O(g) measures growth in the sense that f/g = O(1), i.e. g grows quickly enough to cancel out any tendency of f to go to infinity.
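A quick numeric sketch of that last point (the particular f and g below are illustrative assumptions, not from the discussion above): f itself tends to infinity, yet the ratio f/g stays under a fixed bound, which is exactly what f = O(g) asserts.

```python
# Sketch: f = O(g) means f/g is bounded, not that f is constant.
# f and g here are stand-in examples chosen for illustration.

def f(n):
    return 3 * n + 5   # grows without bound

def g(n):
    return n           # comparison function

# The ratio f(n)/g(n) tends to 3 and never exceeds 8 for n >= 1,
# so f = O(g) even though f(n) -> infinity.
for n in [1, 10, 100, 1000, 10**6]:
    print(n, f(n) / g(n))
```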