Big O: How Code Slows as Data Grows

Big O notation is a computer science technique for analyzing how code performs as data gets larger. It’s a very handy tool for the working programmer, but it’s often shrouded in off-putting mathematics.

In this talk, I’ll teach you what you need to know about Big O, and how to use it to keep your programs running well. Big O helps you choose the data structures and algorithms that will let your code work efficiently even on large data sets.

You can understand Big O even if you aren’t a theoretical computer science math nerd. Big O isn’t as mystical as it appears. It’s wrapped in mathematical trappings, but doesn’t have to be more than a common-sense assessment of how your code will behave.
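To make that concrete, here is a minimal sketch (my own example, not from the talk) of the kind of common-sense assessment Big O enables: the same membership test grows very differently depending on the data structure behind it.

```python
# An illustrative example (assumed, not from the talk): the same question,
# "is this name in my collection?", has different growth rates depending
# on the container you choose.

def in_list(needle, haystack):
    return needle in haystack   # list: linear scan, O(N)

def in_set(needle, haystack):
    return needle in haystack   # set: hash lookup, O(1) on average

names = ["Ann", "Bob", "Cal"]
print(in_list("Cal", names))        # True, after scanning the list
print(in_set("Cal", set(names)))    # True, via a single hash lookup
```

For three names the difference is invisible; for a million, the list scan does a million comparisons while the set lookup still does roughly one.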

Talk given by Ned Batchelder at PyCon 2018.

Thanks to PyCon for giving us permission to post this talk. freeCodeCamp is not associated with this talk. We’re just excited to bring more exposure to it!


This Post Has 32 Comments

  1. Boris the Blade

    Superb talk; this guy is a legend. It's amazing how some people overcomplicate things when it's really simple. I'm not sure if they do it because they like thinking they're greater or smarter than other people, or they just don't know any better. Anyway, 30 minutes well spent.

  2. Ivan S.

    This is simply the best video explaining what Big O is that I've managed to find. Great talk!

  3. 7:57 "mathematical detritus" and "chaff". I hope you're aware that we rigorously define and understand these concepts for our own enlightenment. There's definitely a place for people who can give an intuition about it, like yourself. But don't bash us for trying to understand it completely.

  4. MM


    thank you.

  5. iamserda

    Wait a minute. Even though the end result would have been the same, isn't it (using average instead of worst-case) 3N/2 + 3N/2 + 1? The only thing happening once is one of the returns (mom_name or None). But the if statement occurs 3 times, once at each iteration. So technically, we end up with 3N/2 + 3N/2 + 1 = (6N + 2) / 2 = 3N + 1, which leads to 3N (discard lower-order terms) and eventually N (discard the coefficient).

    I could be wrong; the end result is the same. I just think there's an issue with explaining the if statement as if it doesn't occur at each iteration, when it does. It just happens to evaluate false 2 out of 3 times if the item you're looking for is at the end of the array. But that doesn't mean the if statement wasn't executed 3 times.

    just my 2 cents.
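The commenter's bookkeeping can be sketched in code. This is a hypothetical reconstruction (the talk's actual example isn't reproduced on this page); the `find_mother` function and its `(child, mother)` pairs are assumed names, instrumented to count how often the `if` test runs.

```python
# Hypothetical reconstruction of the example being discussed: search a
# list of (child, mother) pairs, counting executions of the `if` test.

def find_mother(child_name, pairs, counts):
    for child, mother in pairs:      # loop body runs up to N times
        counts["if_tests"] += 1      # the `if` test runs on EVERY pass...
        if child == child_name:
            return mother            # ...but returns at most once
    return None

counts = {"if_tests": 0}
pairs = [("Ann", "Sue"), ("Bob", "Kim"), ("Cal", "Ada")]
mom = find_mother("Cal", pairs, counts)
print(mom, counts["if_tests"])   # Ada 3: match in the last slot, test ran N = 3 times
```

Whether you count the test once per match or once per iteration, the total is some c·N + d, and both simplify to O(N), which is the commenter's point.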

  6. iamserda

    Oh, this is TERRIFIC. I am learning this to prep for upcoming interviews in tech and for a guy without a background in pure math and who took his last math course 13 years ago, I NEEDED THIS.

  7. nikoladd

    O(N^2) is NOT one of those terrible things you try to avoid. It's the lowest-power polynomial (apart from N^1, of course). It's something you try not to run too often, but as far as algorithmic complexity goes, it's pretty good.
    Algorithms of complexity O(N^3) are a dream for many real-world tasks… like finding the most points in space that lie on the same line. And that is N times worse than O(N^2), for those who didn't catch it.
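As a rough illustration of the commenter's point, here is an assumed brute-force sketch (not taken from the comment) of the collinear-points task: checking every triple of points is O(N^3), which is often perfectly workable for modest N.

```python
from itertools import combinations

def collinear(p, q, r):
    # Cross-product test: zero means the three points lie on one line.
    # Using integer arithmetic avoids division and floating-point issues.
    return (q[0] - p[0]) * (r[1] - p[1]) == (q[1] - p[1]) * (r[0] - p[0])

def count_collinear_triples(points):
    # Examines every triple of points: C(N, 3) checks, i.e. O(N^3).
    return sum(1 for p, q, r in combinations(points, 3) if collinear(p, q, r))

points = [(0, 0), (1, 1), (2, 2), (1, 0)]
print(count_collinear_triples(points))   # 1: only (0,0), (1,1), (2,2) share y = x
```

For a few thousand points this cubic scan is still tractable on ordinary hardware, which is why O(N^3) can be "a dream" rather than a disaster.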

  8. The Journeyman

    I salute you, sir. You have done a great job of simplifying a topic that is oft over complicated.

  9. Mujeer Ahmed

    The bean example was very helpful. Every time I think of BigO, the bean example is going to cross my mind. Thank you!

  10. greenie62

    Well, we know at least one of the people who clicked the dislike. I think we can use an O(log(n)) divide-and-conquer to figure out the other 3.

  11. Chandan Tk

    I had trouble understanding Big O, but after this video I really understood each topic. Really helpful and quality teaching from freeCodeCamp 👍.

  12. OJAS

    G.H. Hardy introduced the Big-O, little-o, and tilde (~) comparators in his Pure Mathematics and number theory texts. Wikipedia has a page on these.

  13. Dev H. Kim

    With the number of comments at just 31, I'd say I found a hidden gem. This is what I was looking for! Thanks for sharing!

  14. Ozzy Explains

    Excellent video! It's interesting how when building blockchains and stuff (think proof of work), higher time complexity is better for building secure blocks, because it means it takes time for adversaries to attack the data structure. O(.) is important, but its interpretation depends on context 🙂

  15. Alex

    Great talk. This guy really has a bone to pick with math.

  16. LaloysTV

    Thanks for the knowledge you have shared. Amazing!
