---
Video Title: Introducing Complexity
Description: Theory of Computation
https://uvatoc.github.io/week5
10.1: Introducing Complexity
Nathan Brunelle and David Evans
University of Virginia
---
Last time, we talked about how we could build one circuit that could run other programs: essentially, reprogrammable circuits. What let us conclude that was something we already knew, that we can compute any finite function with circuits. So, since any function at all is in the realm of possibility, maybe what we want to do is pick a function to implement that doesn't force us to make a decision. We want to pick the most general function we can, to keep our options open. What we chose was a function that could evaluate other programs. This leaves the question of how expensive the functions we might want to compute are. How big does our computer need to be, depending on the function we want to implement? Some functions are pretty clearly more expensive to implement than others. When we looked at NAND circuits, NOT was a really easy function to implement with NAND, but UPDATE was a much more difficult one. So, intuitively, it certainly seems like some functions are more complicated, and more expensive, than others. That's what we're going to talk about. How complicated can they get? How big could our implementations actually get? Where's the bound on that? Say I wanted to build some circuit that could evaluate any program at all computing a function mapping n input bits to one output bit. How big would that EVAL circuit need to be? After we cover today's material, we'll be able to answer questions like this much more easily than we could before.
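To make the NAND point concrete, here is a small sketch (my own illustration, with wires modeled as Python bits; the helper names are not from the course) showing why NOT is so cheap in the NAND model:

```python
def nand(a, b):
    """NAND of two bits: 0 only when both inputs are 1."""
    return 0 if (a == 1 and b == 1) else 1

# NOT is easy with NAND: feed the same wire into both inputs (1 gate).
def not_(a):
    return nand(a, a)

# AND already takes 2 NAND gates: one NAND gate, then a NOT of its output.
# (In a circuit the inner NAND is a single shared gate, written twice here.)
def and_(a, b):
    return nand(nand(a, b), nand(a, b))

for a in (0, 1):
    assert not_(a) == 1 - a
    for b in (0, 1):
        assert and_(a, b) == (a & b)
```

Even between these two tiny functions there is already a cost difference: one gate for NOT, two for AND.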
And the way we're going to do this is by getting into something called complexity. We mentioned that there are two major themes in this course: computability and complexity. Computability asks: what can I do with a certain model of computing? Complexity asks: how difficult is it, or how many resources are required, to do that thing under that model of computing? Complexity is essentially a property of a function. If I'm talking about a function, I can talk about the complexity of that function, which is a measure of the resources required to compute it. Some functions are simpler than others, and our measure of how difficult a function is will be the resources required to compute it. We call that measure the complexity. Functions with higher complexity require more resources. And here we really mean the requirement: the number of that resource that is strictly needed. For our NAND circuits, the measure of complexity we're going to use is the number of gates. Functions that require more gates are more complicated functions for this model of computing. In general, a trend you're going to see with complexity is that we are always counting something. In this case, we're counting gates. For other models of computing we'll be counting some other thing instead, but the measure of resources is always going to be a count of something.
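To make "counting gates" concrete, here is a sketch (my own illustration, not from the course) of XOR written as a straight-line NAND program; its cost in this model is simply the number of NAND lines:

```python
def nand(a, b):
    return 1 - (a & b)

def xor(a, b):
    """XOR as a straight-line NAND program. Each assignment below is one
    gate, so this implementation costs 4 gates in the NAND-circuit model."""
    g1 = nand(a, b)    # gate 1
    g2 = nand(a, g1)   # gate 2
    g3 = nand(b, g1)   # gate 3
    g4 = nand(g2, g3)  # gate 4
    return g4

assert [xor(a, b) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 0]
```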
So to measure the complexity of a function, we ask: of all the ways we could potentially implement this function in our model of computing, what is the minimum count of gates, or in general the minimum count of the resource, that we need in order to do it? Something computer scientists really love to do with functions is categorize them by their complexity. We like to arrange functions into groups such that all of the functions in a group have similar complexity: they're all similarly difficult to solve. Once we can arrange functions by complexity into groups of similar difficulty, that gives us a lot of tools and a lot of power for comparing them to other functions, figuring out which classes those belong to, and understanding how efficient our implementations of various functions could be. This is why we like to group these functions. We typically call one grouping of functions by their complexity a complexity class. So a complexity class is a set of functions (that is, the type of a complexity class is "set of functions"), and we decide which functions belong to that set versus don't based on their complexity. That's the overall idea of how complexity is going to work: we look at functions, ask what resources are required to compute them, and ask what category they fit into. Questions about that? [Answering a student question:] That distinction doesn't make as much sense when we're looking at the circuit model of computing. But when we look at something more like how Python or Java work, there are different resources we could measure a function's complexity relative to. For instance, we could ask: what's the smallest amount of time that might be required to compute this function?
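Since the complexity of a function is the minimum resource count over all implementations, one way to see it concretely is by brute-force search. This sketch (my own, not from the course; functions over a couple of input bits are represented as truth tables) finds the fewest NAND gates needed, using iterative deepening so the first size that works is the minimum:

```python
from itertools import product

def min_nand_gates(target, n_inputs=2):
    """Fewest NAND gates needed so that some wire carries `target`,
    where `target` is the function's truth table as a tuple: one output
    bit per input assignment, in itertools.product order."""
    assignments = list(product((0, 1), repeat=n_inputs))
    # Truth table of each input wire, as a vector over all assignments.
    inputs = [tuple(x[i] for x in assignments) for i in range(n_inputs)]

    def reachable(wires, gates_left):
        if target in wires:
            return True
        if gates_left == 0:
            return False
        for i in range(len(wires)):          # try every pair of existing
            for j in range(i, len(wires)):   # wires as the next gate's inputs
                new = tuple(1 - (a & b) for a, b in zip(wires[i], wires[j]))
                if reachable(wires + [new], gates_left - 1):
                    return True
        return False

    size = 0
    while not reachable(list(inputs), size):  # every Boolean function is
        size += 1                             # NAND-computable, so this ends
    return size

# Input assignments in order: (0,0), (0,1), (1,0), (1,1)
print(min_nand_gates((1, 1, 0, 0)))  # NOT a: 1 gate
print(min_nand_gates((0, 0, 0, 1)))  # AND:   2 gates
print(min_nand_gates((0, 1, 1, 0)))  # XOR:   4 gates
```

The key point is the `while` loop: the complexity is defined by the cheapest implementation, so we have to search over all of them, not just one.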
So time could be one of the resources we categorize functions by, or measure complexity relative to. We could also ask about memory: what's the minimum amount of memory you need in order to do it? That's a separate resource. Or suppose I want to compute some function remotely: two parties on opposite sides of the country each have their own inputs, and they're trying to work together to compute some function of both inputs. There, your resource could be the amount of communication they have to do, the number of bits they have to transmit back and forth to compute their function. Virtually anything you can think of that's an expensive resource, something that costs money to do with computing, can be one of these things to count for complexity. Other questions? [Answering a question:] For our circuits, the only thing we're going to consider for complexity is the number of gates. Once we start looking at other models of computing that have more components to them, we'll think about other measures as well. [Answering another question:] That's a very good point. Why is it that we don't care what kind of gate it is? The idea is essentially that if you did care, say if AND gates were more expensive than NOT gates in your setting, then you could consider a slightly different circuit-based model of computation instead, one more in line with what you're actually trying to apply the theory to. For instance, when we build hardware, when we actually deposit the transistors onto silicon, many times we use exclusively NAND gates. In that situation, you might want to consider a NAND circuit model of computing.
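For the remote-computation example, the resource being counted is bits transmitted. Here is a sketch (the protocol and names are my own illustration, not from the course) of the naive protocol for two parties checking whether their n-bit inputs are equal; its cost is n bits of communication:

```python
def equality_naive(alice_bits, bob_bits):
    """Alice sends her whole input across the wire; Bob compares locally.
    Returns (result, bits_communicated)."""
    transmitted = list(alice_bits)   # everything Alice sends to Bob
    result = (transmitted == list(bob_bits))
    return result, len(transmitted)

res, cost = equality_naive([1, 0, 1, 1], [1, 0, 1, 1])
print(res, cost)  # True 4: correct, at a cost of n = 4 transmitted bits
```

Notice that correctness of the answer and the communication cost are tracked separately: complexity questions ask how far below n that cost can be driven while the answer stays correct.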
And in that case, since you're counting NAND gates as your number of gates, you will have AND gates being more expensive than NOT gates, just as a side effect of that decision. If you're implementing things in a way that's a little bit different from that, where it might be more realistic to consider AND, OR, and NOT as equivalently efficient, you could use a model with those gates instead. Or maybe you actually care about the transistors; then, instead of counting gates, you count transistors, and we modify what circuits look like a little bit to focus on transistors instead. Other questions? That's a very good question. Thank you for that.
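As a sketch of that last idea, here is a toy transistor-level cost model (the counts below are the standard static-CMOS figures: 2 transistors for an inverter, 4 for a 2-input NAND; treating AND as a NAND followed by an inverter is my own simplification, not from the course):

```python
# Cost of each gate in transistors, under a static-CMOS cost model.
TRANSISTORS = {
    "NOT": 2,   # CMOS inverter: one PMOS + one NMOS
    "NAND": 4,  # 2-input CMOS NAND
    "AND": 6,   # built as a 2-input NAND followed by an inverter
}

def circuit_cost(gates):
    """Total transistor count of a circuit given as a list of gate names."""
    return sum(TRANSISTORS[g] for g in gates)

# Under gate counting, NOT and AND each cost "1 gate"; under transistor
# counting, AND is three times as expensive as NOT.
print(circuit_cost(["NOT"]))  # 2
print(circuit_cost(["AND"]))  # 6
```

The model of computing you pick, and therefore the thing you count, should match what is actually expensive in the system you care about.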