Could an operation that takes O(1) amortized time have a worst-case time of O(n^2)?

If an operation has O(1) amortized time, can it take O(n^2) time in the worst case?



1 answer


Yes, it is possible. Amortized complexity takes into account how frequently the worst case occurs. So as long as the worst case happens only about once every N^2 operations, the amortized complexity will still be constant.
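
To make the accounting concrete, look at one block of N^2 operations: N^2 - 1 of them cost O(1) each and one of them costs O(N^2), so the whole block does O(N^2) + O(N^2) = O(N^2) work for N^2 operations, which is O(1) per operation on average.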

Take a simple example: a dynamically growing array (I'll call it a vector, as it is called in C++). In most languages, pushing an element to the back takes amortized constant time. Most of the time a push is a simple assignment of a value, but every now and then all of the allocated slots are already in use and we have to reallocate the vector. That is the worst case of push_back, and when it happens the operation takes linear time. However, the vector's growth policy ensures that reallocations are rare enough: each time the vector reallocates, it doubles its capacity. So before the next reallocation happens, there will be n simple push_back operations (assuming n was the capacity of the vector before the reallocation). As a result, the linear worst case occurs at most once per linear number of operations.
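
As a rough illustration of that doubling policy, here is a minimal hand-rolled dynamic array. The class name MiniVec is made up for this sketch; the real std::vector also handles exception safety, allocators, and non-default-constructible element types, which this sketch ignores.

```cpp
#include <cstddef>
#include <utility>

template <typename T>
class MiniVec {
    T*          data_ = nullptr;
    std::size_t size_ = 0;
    std::size_t cap_  = 0;

public:
    ~MiniVec() { delete[] data_; }

    void push_back(const T& value) {
        if (size_ == cap_) {                            // rare worst case: storage is full
            std::size_t new_cap = cap_ ? cap_ * 2 : 1;  // doubling growth policy
            T* new_data = new T[new_cap];
            for (std::size_t i = 0; i < size_; ++i)     // O(n) move of existing elements
                new_data[i] = std::move(data_[i]);
            delete[] data_;
            data_ = new_data;
            cap_  = new_cap;
        }
        data_[size_++] = value;                         // common case: one O(1) assignment
    }

    std::size_t size() const { return size_; }
    const T& operator[](std::size_t i) const { return data_[i]; }
};
```

After a reallocation to capacity 2n, the next n push_back calls are plain assignments, so the O(n) copy can be charged one unit to each of them; that is the usual argument behind the amortized O(1) bound.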



In the same spirit, consider a data structure whose reallocation takes O(n^2), but which guarantees that a reallocation happens at most once per n^2 constant-time operations. That would be an example of an operation with amortized complexity O(1) and worst-case complexity O(n^2).
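
Here is a contrived sketch of that second kind of structure, just to show the bookkeeping. The class PairwiseCache, its update() operation, and the all-pairs checksum it recomputes are all invented for illustration: the element count n stays fixed, update() is O(1) in the common case, and the Θ(n^2) recomputation fires only once every n^2 updates, so the amortized cost per update is Θ(n^2) / n^2 = O(1) even though a single update can cost O(n^2).

```cpp
#include <cstddef>
#include <utility>
#include <vector>

class PairwiseCache {
    std::vector<int> values_;
    long long        pair_sum_ = 0;           // all-pairs checksum, refreshed lazily
    std::size_t      updates_since_rebuild_ = 0;

    void rebuild() {                           // Θ(n^2): the rare worst case
        pair_sum_ = 0;
        for (int a : values_)
            for (int b : values_)
                pair_sum_ += static_cast<long long>(a) * b;
        updates_since_rebuild_ = 0;
    }

public:
    explicit PairwiseCache(std::vector<int> values) : values_(std::move(values)) {
        rebuild();
    }

    void update(std::size_t i, int v) {        // O(1) in the common case
        values_[i] = v;
        const std::size_t n = values_.size();
        if (++updates_since_rebuild_ >= n * n)
            rebuild();                         // triggered at most once per n^2 updates
    }

    long long pair_sum() const { return pair_sum_; }  // may be stale between rebuilds
};
```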
