Big-O notation for beginners
[Apr. 26th, 2010|03:20 pm]
This morning, someone posted the following to Proggit:
"I'm studying for a final in an intro to comp sciences class, and something that I never understood was this O(n^2) thing.

The example he gives is of sorting records. On the last (midterm) exam, the question arose:

'A sort (such as a bubble sort or insertion sort) is O(n^2). 1000 records are sorted in 1 second. How long does it take to sort 3000 records?'

I could not for the life of me figure out how he came to 1 second from 1000 records. (and I got the answer wrong :\ )

Could anyone explain this to me, pretty please? I'd appreciate it!"

Now, that looks a lot to me like someone asking for help with their homework, which is rightly verboten. But I felt like slacking off being helpful, so I wrote out a longish response. But when I clicked "Submit", I was told "The link you are replying to has been deleted". Humph. Only one thing to do: post it here, in the hope that someone will find it useful. Here it is:
I could not for the life of me figure out how he came to 1 second from 1000 records.
He didn't; he just made that figure up for the sake of asking a question. In a real-life situation, you'd have to measure that number experimentally. Here's a vaguely analogous question, with the equivalent made-up figure being the half-hour per hole:
Five labourers set out to dig some holes. Each labourer can dig a hole in half an hour. How long will it take them to dig ten holes?

Now, how would you solve that? Well, there are five labourers and ten holes to dig, so each one must dig two holes. Hence, it will take them 2 * 0.5 = 1 hour.
There's an implicit assumption in my statement of that question, which is that if it takes half an hour to dig one hole, it will take an hour to dig two holes, two hours to dig four holes, and so on. In the jargon, I'm assuming that hole-digging is linear in the number of holes. We can also say that hole-digging is O(n), which means that the time taken to dig n holes is roughly proportional to n.
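If it helps to see that assumption as code, here's a quick Python sketch (the function name and signature are mine, not anything standard):

```python
# A sketch of the linearity assumption: total time scales directly
# with the number of holes, split evenly across the labourers.
def digging_time_hours(holes, labourers, hours_per_hole=0.5):
    # Each labourer digs an equal share of the holes.
    return (holes / labourers) * hours_per_hole

print(digging_time_hours(10, 5))  # -> 1.0
```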
Let's try another example, due to Joel Spolsky:
Schlemiel the Painter wants to paint a line on a road. Unfortunately, he's not very bright, and he leaves the paint tin at the start of the line, then walks back to it each time his brush runs out of paint. Let's suppose he paints one metre per brushstroke, taking one second, and walks at one metre per second. How long does it take him to paint 100m of road surface?

On his first trip, he paints the first metre, taking one second, and then spends one second walking back. On his second trip, he spends one second walking there, one second painting the second metre, and two seconds walking back. On his fifteenth trip, he spends fourteen seconds walking there, one second painting, and fifteen seconds walking back.

Altogether, he spends 2 + 4 + 6 + ... + 200 seconds, which, as it happens, is equal to 100^2 + 100 = 10,100 seconds, or a bit under three hours. In general, if he paints n metres of road surface, he'll take n^2 + n seconds. We say that the Schlemiel the Painter algorithm is O(n^2), which means that the time taken to paint n metres of road surface is roughly proportional to n squared: if n is large, the "+ n" bit of the expression is utterly dwarfed by the "n^2" bit.
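Here's a little Python sketch of Schlemiel's routine, just to check the arithmetic (the function is of course hypothetical):

```python
def schlemiel_seconds(metres):
    """Simulate Schlemiel: for metre i he walks (i - 1) metres there,
    paints for one second, and walks i metres back, all at 1 m/s."""
    total = 0
    for i in range(1, metres + 1):
        total += (i - 1) + 1 + i  # walk there, paint, walk back
    return total

print(schlemiel_seconds(100))  # -> 10100, i.e. 100^2 + 100
```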
Some important - or at least common - algorithms are O(n^2), but if you can find an algorithm with a lower complexity, then it's generally a good idea to use it. In the case of sorting, merge sort is O(n log n), which means that the time taken to sort n items is roughly proportional to n times the logarithm of n.
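For the curious, a minimal merge sort looks something like this in Python - a sketch for illustration, not a production implementation:

```python
def merge_sort(xs):
    # Base case: a list of zero or one items is already sorted.
    if len(xs) <= 1:
        return xs
    # Split in half, sort each half recursively...
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    # ...then merge the two sorted halves in a single linear pass.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # -> [1, 2, 5, 5, 6, 9]
```

The n log n comes from splitting the list in half log n times, doing O(n) merging work at each level.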
Now, let's look at your problem (and I really, really hope this isn't homework). Your program takes 1 second to sort 1000 records, and the time it takes to sort is roughly proportional to the square of the number of records (which is what O(n^2) means). Let's say that it takes k*n^2 seconds to sort n records, where k is a number we have to determine. Then 1 = k*(1000^2), so k = 1/(1000^2). To sort 3000 records will take approximately k*(3000^2) seconds, which is equal to 1/(1000^2) * (3000^2) = (3000^2)/(1000^2) seconds. Cancel out the factor of 1000^2 on top and bottom, and we're left with 3^2 = 9 seconds.
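The same calculation in Python, with the fitted constant k made explicit (function name mine):

```python
# Fit k from the measured run (1 second for 1000 records),
# then predict the time for a larger input under the O(n^2) model.
def predict_quadratic(known_n, known_seconds, new_n):
    k = known_seconds / known_n**2  # from known_seconds = k * known_n^2
    return k * new_n**2

print(predict_quadratic(1000, 1.0, 3000))  # -> 9.0
```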
There's a Polish restaurant on Leith Walk called BigOs, after the Polish national dish. Sadly, it's just a bit too far from the Computer Science department to become the default post-talk restaurant.
Thank you for this! It both confirmed what I'd understood about Big-O notation and gave me a new recipe to try :-)
I've oversimplified, of course: in particular, I implied (a) that an algorithm has one complexity class (whereas in reality its average-case, best-case and worst-case classes may all differ), and (b) that it's only about speed, whereas actually one can use Big-O notation to talk about memory usage and so on.
There's a more formal definition of what it means for an algorithm to be O(f(n)) for some f, but I don't think it sheds much further light.
And yes, bigos is very tasty.
As someone who is about to become a software developer, this was very interesting!
Software developer eh? Do tell, do tell :)
I work for MetaSwitch (you may recognise them as Data Connection) now. It's all very exciting!
Excellent - I had it correct in my brain too. My thesis stands ;).
Of course this doesn't compensate for the fact that in one section I talked about doing a direct DFT (O(n^2)) versus an FFT (O(n log n)), then squared one million data points to get one billion (10^9) operations - yes, I even wrote 10^9 after one billion, I wasn't just getting confused between old and new billions.
So, if it takes a labourer half an hour to dig a hole, how long does it take him to dig half a hole?
could you write more about it?