Problem calculating runtime
An algorithm takes 0.5 ms for an input size of 100. How long will it take to run if the input size is 500 and the program is O(n lg(n))?
My book says that when the input size is doubled, n lg(n) takes "just over double" the time. That doesn't help me much.
The way I did it was as a constant-multiplier solution (which the book doesn't talk about, so I don't know if it's really valid):
0.5 ms = c * 100 * lg(100) => c = 0.000753
So,
0.000753 * 500 * lg(500) = 3.37562 ms
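Here's the same calculation as a quick Python sketch (I'm assuming lg means log base 2; using a different base only changes c, not the final estimate, because the base change cancels out):

```python
import math

# Known measurement: 0.5 ms for n = 100 on an O(n lg n) algorithm.
measured_ms = 0.5
n_known = 100
n_target = 500

# Solve 0.5 = c * 100 * lg(100) for the constant multiplier c.
c = measured_ms / (n_known * math.log2(n_known))

# Extrapolate to n = 500.
estimate_ms = c * n_target * math.log2(n_target)

print(f"c = {c:.6f}")                      # ~0.000753
print(f"estimate = {estimate_ms:.4f} ms")  # ~3.37 ms (3.3756 if you plug the rounded c back in)
```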
Is this the correct way to calculate runtime, and is there a better way to figure it out?
This is not entirely correct. Thomas was right in saying there is overhead and the real equation looks more like
runtime = inputSize * lg(inputSize) * singleInputProcessTime + overhead
singleInputProcessTime refers to machine operations like loading address spaces, arithmetic, or whatever else must be done every time you touch a piece of input. This time can range from a few CPU cycles to seconds or minutes depending on your domain. The important thing to understand is that this time is roughly CONSTANT, and therefore it does not change how the runtime grows at large enough input sizes.
Overhead is the cost of setting up the problem/solution, such as reading the algorithm into memory, distributing the input across servers/processes, or any operation that only needs to be done once or a fixed number of times and does NOT depend on the input size. This cost is also constant and can range from a few CPU cycles to minutes depending on the method used to solve the problem.
The inputSize and n * lg(n) parts you already know about. As far as your homework problem goes, as long as you explain how you arrived at the solution, you should be fine.
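For illustration, here is a minimal sketch of that model in Python. The numbers are made up (they are not from the question); the point is that a single measurement cannot separate the per-item cost from the overhead, but two measurements at different sizes can, and then you can extrapolate:

```python
import math

def nlogn(n):
    return n * math.log2(n)

# Hypothetical measurements (made-up numbers): runtime in ms for two input sizes.
measurements = {100: 0.5, 200: 1.1}

# Model: runtime = n*lg(n) * per_item + overhead.
# Two measurements give two linear equations in (per_item, overhead).
(n1, t1), (n2, t2) = measurements.items()
per_item = (t2 - t1) / (nlogn(n2) - nlogn(n1))
overhead = t1 - per_item * nlogn(n1)

def predict(n):
    return nlogn(n) * per_item + overhead

print(f"per-item cost ~ {per_item:.6f} ms, overhead ~ {overhead:.4f} ms")
print(f"predicted runtime for n = 500: {predict(500):.2f} ms")
```

Once the input gets large, the n * lg(n) term dominates and both constants matter less and less, which is why the simple constant-multiplier estimate in the question is usually close enough.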