You realize that algorithmic complexity is the very definition of how fast a program runs, right?
No, I actually don't. Can you have an algorithmic complexity of, say, 25 fps or 10 kb/s?
No, algorithmic complexity describes how an algorithm's running time grows with the size of its input, which is what lets you compare one algorithm to another.
An algorithm with O(n^2) complexity is going to take much longer than one with O(n*log(n)) complexity on large inputs, for example. In fact, a program that is much more complex to write, but has a smaller algorithmic complexity, will run faster than the simple solution once the input gets large enough, as the sketch below shows.
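A minimal sketch of that point, assuming Python and an arbitrary input size of 20,000 elements: the short selection sort is O(n^2), while the longer merge sort is O(n*log(n)), and on inputs this large the "more complex" program finishes far sooner.

```python
import random
import timeit

def selection_sort(xs):
    # Simple to write, but O(n^2): rescans the remaining tail on every pass.
    xs = list(xs)
    for i in range(len(xs)):
        j = min(range(i, len(xs)), key=xs.__getitem__)
        xs[i], xs[j] = xs[j], xs[i]
    return xs

def merge_sort(xs):
    # More code, but O(n*log(n)): split in half, sort each half, then merge.
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

data = [random.random() for _ in range(20_000)]
print("selection sort:", timeit.timeit(lambda: selection_sort(data), number=1), "s")
print("merge sort:    ", timeit.timeit(lambda: merge_sort(data), number=1), "s")
```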
EDIT: Though I will agree that in this case it is basically irrelevant.
EDIT EDIT:
Well, I should clarify that these bounds don't give any actual speed or time; they're used solely for comparing algorithms to each other. The reason is that at large enough inputs, constant-factor overhead stops mattering, so it doesn't really make sense to quantify exact times with messy coefficients and the like.
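To illustrate why those coefficients get dropped (with made-up cost models, not measurements): give a linear algorithm a huge constant factor of 1000 steps per element and a quadratic algorithm no overhead at all, and the linear one still pulls ahead once n passes 1000.

```python
# Hypothetical step counts, purely for illustration:
# algorithm A: 1000 * n   (large constant factor, linear growth)
# algorithm B: n * n      (no overhead, quadratic growth)
def steps_a(n):
    return 1000 * n

def steps_b(n):
    return n * n

for n in (10, 100, 1_000, 10_000, 100_000):
    winner = "A" if steps_a(n) < steps_b(n) else "B (or tie)"
    print(f"n={n:>7}: A={steps_a(n):>12} B={steps_b(n):>15} -> fewer steps: {winner}")
```

Past n = 1000 the 1000x overhead is irrelevant, which is exactly why big-O notation ignores it.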
Hence the title "Basic".