--- Video Title: Asymptotic Operators Description: Theory of Computation https://uvatoc.github.io/week5 10.4: Asymptotic Operators - Intuition behind Big-O, Big-Omega, Big-Theta - Formally Defining the Asymptotic Operators Nathan Brunelle and David Evans University of Virginia --- Next, we're going to talk about how we can represent things so that the constants don't matter, and that is using this asymptotic notation. Okay, so most of you who have seen this before most likely had it presented to you wrong. The way we typically use big O, big Omega, and big Theta is to categorize runtimes of programs or something like that. That's probably how you saw it before: you would say something to the effect of "this Java program is big O of N squared." But Java programs are not big O of anything. What we really mean is that the function representing the number of resources that Java program uses belongs to big O of N squared; as a shortcut for this idea, we say the Java program is big O of N squared. So big O, big Omega, big Theta, these are not runtimes. They are sets of functions, typically functions that map real numbers to real numbers. So when I talk about, say, big O of some function f, written big O of f of n, that is the set of all of the functions that are asymptotically upper bounded by f. When we say "asymptotically," we're talking about the trend as the input gets larger and larger and larger, and specifically how that trend compares to f. So f is going to grow at some rate as its inputs get larger, and any function that eventually becomes upper bounded by f belongs to the set big O of f. That is how big O works: given a function f, big O of that function is a set of functions, namely those that are eventually overtaken by f.
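To make the "eventually overtaken" idea concrete, here is a small numeric sanity check, not a proof. The functions f(n) = 3n² + 5n and g(n) = n², and the witnesses c = 4 and n₀ = 5, are example choices of mine, not from the lecture:

```python
# Illustrative check (not a proof): f(n) = 3n^2 + 5n is eventually
# upper bounded by a constant multiple of g(n) = n^2, so f is in O(g).

def f(n):
    return 3 * n**2 + 5 * n

def g(n):
    return n**2

c, n0 = 4, 5  # candidate "fudge factor" c and cutoff n0 (example values)

# For every n at or past the cutoff (sampled up to 10,000),
# f(n) <= c * g(n) holds.
ok = all(f(n) <= c * g(n) for n in range(n0, 10_000))
print(ok)  # True
```

Note the bound can fail before the cutoff (f(1) = 8 > 4 = c·g(1)); the definition only cares about what happens eventually, which is exactly what n₀ captures.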
When we talk about big Omega, that's a similar idea. If I start with some function f and ask for big Omega of that function, I get as a result all of the functions that eventually overtake f. That is, they're lower bounded by f: as they trend over time, they're eventually going to be above f. And big Theta is the functions that, depending on the constants you choose, can be made to eventually rise above f and also to eventually fall below f. So we're going to do an illustration of this first. Let's say that I have some function g of n, and I want to know: does f of n belong to big O of g of n? What we want to know is whether we can make it so that eventually f of n becomes upper bounded by g of n within a constant factor. We're not going to care about constants: if two functions differ by a constant factor, that's not important to me. So what I'm going to do is try to find some constant fudge factor. If I have this function g, maybe what I could do is sort of tilt the function, or shift it up and down, or something like that. Is there some amount of tilting and shifting up and down I can do so that eventually, after some point, g of n is always going to be above f of n? If there is some way to do that, then we say f of n belongs to big O of g of n. So notice, right here, we have an equal sign. Most people use equal signs for this; most people write f of n equals big O of g of n. While that is the most common notation, it is actually incorrect. The more correct notation is f of n belongs to big O of g of n.
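The notation point above, in symbols (big O of g of n is a set, so membership is the right relation):

```latex
f(n) \in O(g(n))  % more correct: O(g(n)) is a set of functions
f(n) = O(g(n))    % more common, but an abuse of notation
```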
Element-of is more correct; equals is more common. Not my decision. Okay, so if we can find some tilt or shift to apply to g so that eventually g is always above f, then we say f of n belongs to big O of g of n. On the other hand, if there is some way I can tilt and shift g so that eventually g will forever be underneath f, that means f of n belongs to big Omega of g of n. If I can do both, potentially with different amounts of tilting and shifting, then I conclude that f of n is big Theta of g of n. So big O essentially means asymptotically less than or equal to. You can think of big Omega as saying asymptotically greater than or equal to, and big Theta as asymptotically equal to. When we say big O is asymptotically less than or equal to, that means with some fudge factor, f is eventually always smaller. When we say greater than or equal to for big Omega, that says with some fudge factor, f is eventually always bigger. And big Theta, asymptotically equal to, means you can make both happen. Okay, so let's look at the formal definitions. The formal definition of big O of g of n is the set of all functions that are at most within some constant of g for sufficiently large n. In particular, it's the set of all functions f for which I can find two positive constants, c and n naught, such that for any n larger than n naught, f of n is less than or equal to c times g of n. The constant c is essentially that fudge factor; that's me tilting the function. The "for all n greater than n naught" part is how we express "eventually."
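Writing the formal definition just described in symbols (the choice of strict versus non-strict inequality at the cutoff does not change the set):

```latex
O(g(n)) = \{\, f \;\mid\; \exists\, c > 0,\ \exists\, n_0 > 0,\ 
                \forall n \geq n_0 :\ f(n) \leq c \cdot g(n) \,\}
```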
So it says we pick some position n naught, and once we get past the early noise before that position, the trend forever after is that c times g is at least f. For big Omega, we have exactly the same definition except we flip the inequality: greater than or equal to instead of less than or equal to. Big Theta is the intersection of the two, so a function that satisfies both definitions is in big Theta.
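The same two definitions in symbols, with big Theta as the intersection the lecture describes:

```latex
\Omega(g(n)) = \{\, f \;\mid\; \exists\, c > 0,\ \exists\, n_0 > 0,\ 
                     \forall n \geq n_0 :\ f(n) \geq c \cdot g(n) \,\}

\Theta(g(n)) = O(g(n)) \cap \Omega(g(n))
```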