---
Video Title: First Complexity Proof
Description: Theory of Computation https://uvatoc.github.io/week5
10.3: First Complexity Proof - Proving Relationships between Complexity Classes - Why we want to ignore constant factors
Nathan Brunelle and David Evans, University of Virginia
---

We already know enough to start proving things about complexity classes. So I'm going to say that SIZE^AON(s) is like the SIZE(s) we saw before: the set of all functions that require at most s gates to implement. But SIZE without the AON superscript meant we were counting NAND gates; with the AON superscript, we're counting the number of AND, OR, and NOT gates required.

The theorem here says SIZE(s/2) ⊆ SIZE^AON(s) ⊆ SIZE(3s). In other words, any function I can compute with s/2 NAND gates I can definitely compute with s AND/OR/NOT gates, and any function I can compute with s AND/OR/NOT gates I can certainly compute with 3s NAND gates.

Let's do the right-hand containment first. How do we know that anything we can do with s AND/OR/NOT gates we can definitely do with 3s NAND gates? We mentioned earlier that we can build AND, OR, and NOT out of NAND gates, and when we looked at those constructions, the most expensive one was OR, which required 3 NAND gates (AND took 2, and NOT took 1). So in the worst case, if we were so unlucky that all s of our AND/OR/NOT gates were ORs, converting the circuit gate by gate to NAND would triple the number of gates. That means anything I could do with s AND/OR/NOT gates, I could certainly do with 3s NAND gates.

Yes? Yeah, good point: the gate count really does need to be an integer, so the s/2 on the left should be a ceiling. We'll write ⌈s/2⌉; I should have had the ceiling there.

Now, can someone demonstrate why anything I can implement with ⌈s/2⌉ NAND gates I can also implement with s AND/OR/NOT gates? Yeah: I can replace each NAND gate with an AND gate followed by a NOT gate, ending up with twice as many gates as before. So anything I could do with ⌈s/2⌉ NAND gates, I can do with s AND/OR/NOT gates.

So we've shown containments between these complexity classes. We could measure complexity by the exact gate count, but the reason we usually don't in practice is exactly this: NAND and AND/OR/NOT are very similar models of computing, and the complexity of a function differs by at most a constant factor between them.
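To make the conversion argument concrete, here is a minimal Python sketch (my own illustration, not code from the course; the circuit representation and function names are assumptions). It encodes the gate-by-gate substitutions described above, OR costing 3 NAND gates in the worst case and each NAND costing 2 AND/OR/NOT gates, and checks the resulting 3x and 2x bounds.

```python
# Sketch (illustrative only): count gates under the two gate-by-gate conversions.
#
# Simulating AND/OR/NOT gates with NAND gates:
#   NOT(a)    = NAND(a, a)                    -> 1 NAND gate
#   AND(a, b) = NAND(NAND(a, b), NAND(a, b))  -> 2 NAND gates
#   OR(a, b)  = NAND(NAND(a, a), NAND(b, b))  -> 3 NAND gates (worst case)
NAND_COST = {"NOT": 1, "AND": 2, "OR": 3}

# Simulating a NAND gate with AND/OR/NOT gates:
#   NAND(a, b) = NOT(AND(a, b))               -> 2 gates
AON_COST_PER_NAND = 2


def aon_to_nand_count(aon_gates):
    """NAND gates produced by converting an AND/OR/NOT circuit gate by gate."""
    return sum(NAND_COST[g] for g in aon_gates)


def nand_to_aon_count(num_nand_gates):
    """AND/OR/NOT gates produced by converting a NAND circuit gate by gate."""
    return AON_COST_PER_NAND * num_nand_gates


if __name__ == "__main__":
    s = 10

    # Right-hand containment, worst case: every one of the s gates is an OR.
    all_ors = ["OR"] * s
    assert aon_to_nand_count(all_ors) == 3 * s   # s AND/OR/NOT gates -> at most 3s NAND gates

    # Left-hand containment: s/2 NAND gates become at most s AND/OR/NOT gates.
    assert nand_to_aon_count(s // 2) <= s
    print("Both constant-factor bounds hold for s =", s)
```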
So anything I can do in AND/OR/NOT with some number of gates is at most a constant factor different if I convert it to NAND gates, and vice versa. Transforming implementations of functions between these models of computing costs only a constant factor in efficiency, one way or the other. And because the changes are only constant factors, if we don't want to think too hard about which model of computing we're using, what we often like to do in computer science is just say the constants don't matter. Transforming between very similar models of computing typically results in only constant differences in the efficiency of the functions we're trying to implement, and so we often just ignore the constants. That way we can say something like: if this function requires n² AND/OR/NOT gates, then, ignoring constants, it also requires n² NAND gates, because those counts differ by at most a constant factor. And if we're ignoring constants, we can just say the function requires n² gates and not even have to worry about what kinds of gates they are.
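Here is a small toy illustration in Python (my own, using a hypothetical n² gate-count function rather than anything measured in the course) of why the constant factor washes out: the ratio between the two models' gate counts stays bounded no matter how large the input gets, so "on the order of n² gates" is a statement we can make without naming the gate set.

```python
# Toy illustration (assumed numbers, not course data): a hypothetical function
# that needs about n^2 AND/OR/NOT gates needs at most 3*n^2 NAND gates after
# the gate-by-gate conversion from the theorem.

def aon_gate_count(n):
    # Hypothetical: roughly n^2 AND/OR/NOT gates for inputs of length n.
    return n ** 2

def nand_gate_count_upper_bound(n):
    # The theorem's conversion: at most 3 times as many NAND gates.
    return 3 * aon_gate_count(n)

if __name__ == "__main__":
    for n in [10, 100, 1000, 10_000]:
        ratio = nand_gate_count_upper_bound(n) / aon_gate_count(n)
        print(f"n={n:>6}: NAND bound / AND-OR-NOT count = {ratio}")
    # The ratio never grows with n; it is the constant 3. Ignoring constants,
    # both counts are "on the order of n^2", so we can say the function needs
    # n^2 gates without specifying which kind of gates we mean.
```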