Complexity of finding one of many elements in an array

The question is pretty much what the title says, with a slight twist. If I recall correctly, searching for an entry in an array of size n has O(n) complexity in the average case.

I assume this also holds if the array contains a fixed number of entries, any of which we would accept as a match.

But what if the number of matching entries is somehow related to the size of the array, i.e. grows with it? I have such a case, but I do not know the exact relationship between the size of the array and the number of matching entries; it could be linear, it could be logarithmic. Is the average case still O(n)?

Any ideas would be appreciated.

edit: example

array size: 100

contents of the array: each position holds a completely random number from 1 to 10.

what we're looking for: the first occurrence of "1"

From a naive point of view, we should on average find a match after about 10 comparisons with any kind of linear search (which is what we have to do, since the contents are not sorted).

As a rule of thumb, constant factors are dropped in big-O notation. Does that mean the search is still O(n) in time, even though on average it only takes about 10 comparisons here?
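(To make the example concrete, here is a small Python sketch of my own, not part of the original question, that runs a linear search for the first 1 in random arrays of 100 values from 1 to 10 and reports the average number of comparisons; the function names are only for illustration.)

import random

def comparisons_to_find(arr, target):
    # Linear search: number of comparisons until the first hit
    # (or len(arr) if the target is absent).
    for i, value in enumerate(arr):
        if value == target:
            return i + 1
    return len(arr)

def average_comparisons(n=100, trials=10_000):
    total = 0
    for _ in range(trials):
        arr = [random.randint(1, 10) for _ in range(n)]  # values 1-10, uniform
        total += comparisons_to_find(arr, 1)
    return total / trials

print(average_comparisons())  # empirically close to 10, although the worst case is n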



3 answers


It is still O(n) either way.

Consider searching for 1 here:

[9,9,9,9,9,1]
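(A quick sketch of my own to illustrate the point: a linear search over this array has to inspect every element before it finds the 1, so the worst case stays proportional to n.)

def linear_search(arr, target):
    # Return the index of the first occurrence of target, scanning left to right.
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1

print(linear_search([9, 9, 9, 9, 9, 1], 1))  # prints 5 after inspecting all 6 elements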

      



If you are performing a linear search over the array, then the average time complexity of finding one of M matching elements in an array of N elements is O(I), where I is the average index of the first matching element. If the array is ordered randomly, I will be about N/M on average, so the search is O(N/M) on average and O(N-M) in the worst case (all M matches pushed to the very end of the array).
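(A short simulation of my own, assuming the M matching elements are placed uniformly at random among the N positions, to check that the average index of the first match is about N/M; the helper names are made up for illustration.)

import random

def first_hit_index(n, m):
    # Place m "match" markers among n positions at random and return
    # the 1-based index of the first match found by a linear scan.
    arr = [True] * m + [False] * (n - m)
    random.shuffle(arr)
    return arr.index(True) + 1

def average_first_hit(n, m, trials=20_000):
    return sum(first_hit_index(n, m) for _ in range(trials)) / trials

n, m = 1000, 10
print(average_first_hit(n, m), n / m)  # exact expectation is (n + 1) / (m + 1), close to n / m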





I have two points to make on this issue.

First, if you consider an unsorted array (as in the example shown here), the asymptotic complexity for the average case would of course be O(n).

Let's take an example. We have n elements in an array, or better said a Vector. The average case of a linear search, going element by element, needs about n/2 comparisons, i.e. half of n, which is O(n) as the average case. If m more elements are added, the order of complexity does not change: it is now O(n+m), still linear, and the average case becomes roughly (n+m)/2 comparisons, i.e. about m/2 additional comparisons as a result of adding elements to the Vector.

So we find that as the size of the array (or Vector) grows, the order of complexity does not change, although more comparisons are required, since on average it is n/2.
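(A small sketch of my own showing the n/2 behaviour: searching for one specific element in a shuffled array of n distinct values takes about n/2 comparisons on average, and doubling n roughly doubles the count while the complexity class stays O(n).)

import random

def comparisons_until_found(arr, target):
    # Count the comparisons a linear scan makes until it finds the target.
    for count, value in enumerate(arr, start=1):
        if value == target:
            return count
    return len(arr)

def average_for_size(n, trials=5_000):
    total = 0
    for _ in range(trials):
        arr = list(range(n))
        random.shuffle(arr)
        total += comparisons_until_found(arr, 0)  # search for one specific element
    return total / trials

for n in (100, 200):
    print(n, average_for_size(n))  # roughly n / 2 in both cases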

Second, if the array or vector is sorted, then a binary search has a worst case on the order of log(n+1) comparisons, again depending on n. The average case also grows logarithmically as n grows, but the O(log n) order of complexity does not change!
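(For the sorted case, a minimal sketch of my own using Python's bisect module to find the first occurrence of a value with a binary search in O(log n).)

from bisect import bisect_left

def first_occurrence(sorted_arr, target):
    # Binary search: index of the first occurrence of target, or -1 if absent.
    i = bisect_left(sorted_arr, target)
    if i < len(sorted_arr) and sorted_arr[i] == target:
        return i
    return -1

print(first_occurrence([1, 1, 2, 5, 9, 9, 9], 9))  # prints 4 using about log2(n) probes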
