All right. Good morning. Thank you all. Let's get started. Today is the 18th. We're now starting the discussion of Karp's paper. The general problem he was considering, the one he helped introduce, is what's called the P equals NP problem: the question of the role of determinism and non-determinism in time complexity, how long it takes for an algorithm to run. The letter P represents all problems solvable by algorithms that take a polynomial amount of time. NP, non-deterministic polynomial, represents all the problems whose algorithms would take a polynomial amount of time to run if you had a non-deterministic machine. Now, there was a question in complexity about the difference between a hard and an easy problem. We mentioned that Jack Edmonds suggested, colloquially, that the difference is between polynomial and exponential: he felt that a polynomial-time algorithm is relatively easy, and something that takes an exponential amount of time is hard. Even though "easy" and "hard" there are colloquial terms, there's some truth to it, and that's what inspired the research by Cook and Karp. The official definition of hard, though, is not used in that way. When we say NP-hard, or hard for some complexity class, it means the problem is as hard as any problem in that class. So the first problem, which I mentioned on line four, the first problem that was shown to be NP-hard, which in this context means as hard as any problem in the NP class.
One more time: NP means all problems that take a polynomial amount of time on a non-deterministic computer. And the central problem that's as hard as all those others is satisfiability. What is satisfiability? You have a Boolean predicate, a function. We mentioned that for a Boolean predicate, most discrete math books say the inputs can be any values; the key is the output, which is zero or one. But in the case of Cook and Karp, the inputs are also zeros and ones. So you have a Boolean predicate built from ANDs, ORs, and NOTs over variables x1, x2, x3, and so on. Satisfiability means finding a sequence of zeros and ones such that when the x_i take those values, the Boolean predicate p comes out true. That's called satisfying the Boolean predicate. And Cook proved in 1971 that every problem in NP reduces to satisfiability; at its core, every one of those problems boils down to satisfiability. Why that's so, we will prove later in the semester; we'll spend a day going through his proof. The proof is interesting, though not something we'll gain much practical knowledge from, but the result is important. So that's the notion from Cook. The key, as we mentioned last time, is that with Cook's notion of reduction, he only had to consider reducing one problem to satisfiability. When Cook defined the word reduction, it was in one direction: A reduces to B. That's it. That's reduction: A is as hard as B. But Karp is going to show hardness, NP-hardness, in a way that requires reduction between the two problems. It's not just that A reduces to B; B has to reduce to A as well. So Karp requires reduction in two directions.
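To make the satisfiability definition concrete, here is a minimal brute-force sketch: it tries every assignment of zeros and ones to the variables and reports one that makes the predicate true. The predicate `p` below is a hypothetical example, not one from Karp's paper; the exponential loop over all assignments is exactly the cost that makes SAT the benchmark hard problem.

```python
from itertools import product

def satisfiable(predicate, n):
    """Try every 0/1 assignment to x1..xn; return a satisfying
    assignment if one exists, else None. Exponential in n, which
    is why this counts as brute force, not a fast algorithm."""
    for bits in product([0, 1], repeat=n):
        if predicate(*bits):
            return bits
    return None

# Hypothetical predicate: p(x1,x2,x3) = (x1 OR NOT x2) AND (x2 OR x3)
p = lambda x1, x2, x3: (x1 or not x2) and (x2 or x3)
print(satisfiable(p, 3))  # first satisfying assignment found: (0, 0, 1)
```

Verifying a proposed assignment, by contrast, is a single evaluation of the predicate, which is the asymmetry the whole discussion turns on.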
In his paper, which we're about to go through, we'll talk about the definition of reduction more formally in a moment, and then we'll go into his problems, the 21 problems in Karp's paper. Really it's the Karp 20, because one of the problems in Karp's paper is Cook's problem, but we still call it the Karp 21 even though he only did 20. Now, Karp requires two directions, but in his paper he's only going to show one direction of reduction. Why? Because Karp is going to use satisfiability as the other problem. Karp is interested in showing what satisfiability reduces to. Cook said that every problem in NP reduces to satisfiability; Karp was interested in the other direction: satisfiability reduces to whom? In doing so, using Cook's result, which already covers the other direction, Karp completes the picture by showing that satisfiability goes back and reduces to that other problem. So Karp is doing the two directions; it's just that one of them Cook already proved. Then we mentioned Levin, which we're not going to go into; it's just for the sake of a complete picture of the definition of reduction. Levin's notion of reduction takes a totally different perspective. Levin observes that in order to solve a problem, you're usually missing a piece of information: if only I knew this, right? So in Levin's mind, the complexity of a problem boils down to the cost of getting that piece of information you need, because once you have it, you can solve your problem. Levin deals with that extra piece of information, and the complexity is the cost of obtaining it. That's what his notion of reduction, for complexity,
is about: the cost of finding, in essence, the missing piece of information. All right, that's a running start to get us to today; these concepts are in the previous two sets of notes. In terms of notation, when we talk about reduction, they write B less than or equal to A. That's the notation for B reduces to A. Why less than or equal? There is an explanation, and I will give it, but it takes a while, so to jumpstart Karp's paper I'm going to do a shortened version of reduction. The paper itself, which hopefully you've downloaded, I posted on Brightspace; either way you can search for it. It's called Reducibility Among Combinatorial Problems. Let me see if I can share the PDF here. Yes, good. So this is Karp's paper, Reducibility Among Combinatorial Problems. The initial part is the definitions, a number of which we've done already. Let me give you a bit of a map to the paper; you can put this down in your notes if you want, but we'll get back to it. On pages 89 and 90 of Karp's paper, Karp lists a number of different problems, their authors, and the years they were published.
Those problems are not the Karp 21. Those problems are in P. In other words, for each of the problems he has on pages 89 and 90, he wants to give the reader a little taste of what was known, as of 1972, to be solvable in polynomial time by normal, regular, deterministic algorithms. He picks very interesting cases, and I'll explain at a different time why he chose these algorithms. The hint I'll give you, in case you're reading over the paper during our next several lectures, is that he chooses problems that are similar to the problems he's going to prove are NP-complete; well, NP-complete, NP-hard. One of his sub-themes is: you know what, these problems are somewhat similar, and all I did, says Karp, was tweak them, change them a little. But guess what: change some of these problems a little, and all of a sudden they become NP-hard. So if you want, familiarize yourself with the problems on pages 89 and 90. We'll go through them at a different time, but as we go through the Karp 21, I'll refer back and say: this Karp-21 problem is NP-hard, but its cousin is actually in P. A similar problem, changed slightly. So one problem can be in P, and a similar problem, changed a little, will be NP-hard. I'll be pointing those out. All right. Then, here, starts... well, no, sorry, this is the proof by Cook: the satisfiability problem, which we mentioned. He writes the problems in a mathematical manner; I will of course translate them into a more regular, colloquial way of saying them. He proves Cook's satisfiability theorem, Cook's problem, on pages 92 and 93.
Then, ignoring the definition of "complete" for the moment, here we go: on page 94 we have the problems, the Karp 21, which he lists. In the middle of the listing, on the next page, before he gets to number 20, he has a tree. And this tree is the tree of all his 21 problems. If you look, at the root of that tree is satisfiability. One more time, because this matters. Cook, as Karp summarizes above in the paper on pages 92 and 93, showed that every problem in NP, including the 21 problems that Karp is going to focus on, reduces to satisfiability. Karp is then going to show that satisfiability, in turn, reduces to them. That's what Karp is going to show. The satisfiability problem, again, is from Cook. Karp chooses 20 other problems, and he shows that a reduction occurs from satisfiability. Now, in order to show that, and please understand this, the actual definition of reduction works only one level at a time. Meaning, look at the tree: satisfiability reduces to clique, a problem we'll define. Then clique reduces to node cover. Then node cover reduces to directed Hamiltonian circuit, and directed Hamiltonian circuit reduces to undirected Hamiltonian circuit. In doing so, it's as if satisfiability reduces to undirected Hamiltonian circuit. But from a definitional point of view, realize that's not what happens. Satisfiability isn't really reducing to undirected Hamiltonian circuit; we only say that colloquially. What actually happens is that satisfiability reduces to clique; that's it. Clique reduces to node cover. Node cover reduces to directed Hamiltonian circuit.
Directed Hamiltonian circuit reduces to undirected Hamiltonian circuit. So please understand, especially for definitional purposes and especially for exams: the official definition of reduction isn't that satisfiability reduces to undirected Hamiltonian circuit, although indirectly we have shown that. Directly, all we've shown is that satisfiability reduces to clique. You go one level at a time. Keep this in mind. In a moment I'm going to explain what reduction is, and I'm going to refer to a specific pair of problems: numbers 12 and 13. Let me put the tree up for a moment. Satisfiability reduces to another version of satisfiability, SAT-3; ignore that for the moment. SAT-3 reduces to chromatic number, and chromatic number, for example, reduces to clique cover. Chromatic number turns out to be number 12; clique cover is number 13. Keep that in mind. I'm going to refer to that little piece in the center of the tree, chromatic number reducing to clique cover, in our discussion of the terminology of reduction. Okay, let me share our notes again, and keep numbers 12 and 13 in mind. So again, the terminology of reduction is written B less than or equal to A. Why it's written less than or equal, I'll discuss another time. It reads "B reduces to A." But realize there are a lot of details packed in there, eight details in fact. I'm going to state them abstractly, and then I'm going to give the example using chromatic number and clique cover, problems 12 and 13.
The eight details are as follows. In reduction, you want to prove that a problem you've never studied before, or one whose hardness you don't know, is hard, and you use reduction to show it. The way it works is that you take another problem known to be hard, and show that it reduces to the unknown problem. So again, line 21 says B reduces to A, and in the context of Karp's paper that will always refer to the following. Detail one: A is not known to be hard, but B is known to be hard. That's the backdrop. We have a hard problem B, and an unknown problem A, unknown in the sense that we don't know whether it's hard. We can define problem A, but we don't know if it's hard, and we want to prove that A is as hard as B; so we show B reduces to A. Again, on line 25, the terminology assumes you know B is hard, you don't yet know A is hard, and you're using reduction to show that A is also hard. Detail two, and this is a little funny in the wording, not the wording on line 26 itself, but in that we say B reduces to A: "B reduces to A" on line 26 actually means that A solves B's problem. That may sound odd, because with the "from" and the "to," as I'll note in lines 31 and 32, you reduce from B and you reduce to A, so it sounds like the direction is from B to A. And while that is the direction of the term reduction, the actual action is performed by A: A is going to solve B's problem. Keep that in mind.
It's a little counterintuitive, but that's the way the notation and the mechanism must work: it must be that A solves B's problem. That's the second detail, line 26. Detail three, line 27: B is the parent in the reduction tree. Remember the tree I showed from the PDF, on the page numbered 96 in Karp's paper. It's not that the paper, or the PDF, is 96 pages; it's that the page with a 96 in the corner has the tree, and that tree has all the reductions proven in Karp's paper. Now, since we say B reduces to A, the question is: who's the parent and who's the child in the tree? B is the parent. That's detail three. Detail four: A is the child. So B is the parent and A is the child in Karp's tree, and by the definition of reduction, B reduces to A: the parent reduces to the child. And based on detail two, line 26, that A solves B's problem, we get detail six, line 30: since B is the parent and A is the child, and A solves B's problem, it's the child that solves the parent's problem. Even though we say B reduces to A, the reduction is from B to A, which is the terminology in details seven and eight, lines 31 and 32: B reduces to A, and A reduces from B. It's from B to A; you're reducing from problem B to problem A. And that means the child, A, solves the parent's, B's, problem. Then detail five, which I could have put anywhere, line 29: the root of the entire tree is satisfiability, SAT for short. That's Karp's problem number one, and that's the root of the tree.
Now, another notation, instead of less than or equal, is a symbol that looks like the letters O and C connected, with no space between them. Some books will write B, then that joined-OC symbol, then A. It's not an acronym for O and C; it has nothing to do with any words; it's just the icon used in some of the literature. So you may see what looks like an O and a C connected, instead of less than or equal, as the notation for reduction. That's what I write in lines 34 and 35. And finally, many books will have a footnote. Again, when I give you the more formal presentation of B less than or equal to A (right now I want us to jumpstart Karp's paper; it won't be a long discussion, maybe half a lecture), I'll explain why they use less than or equal. Why they use the OC symbol, I don't know; let's ignore that. But the choice of less than or equal is purposeful, and it needs an explanation. Anyway, sometimes, instead of writing B less than or equal to A as on line 21, books will write B, less than or equal with a very small subscript P, then A. Why? We'll have to elaborate on that later, but that P means the reduction takes a polynomial amount of time, to remind you that the complexity of NP-hard has to do with P. There's a deeper reason, but we won't go through it now. Anyway, on line 37, very often they put that small P. All right, those are the notation issues.
Now, let's take an example from Karp's paper and apply this terminology. Remember we had chromatic number, and chromatic number reduces to clique cover; I showed you that in the tree. Clique cover turns out to be Karp number 13. Using our terminology, we have the following. Detail one: the B is Karp 12, chromatic number. At that point, chromatic number is known to be hard, but clique cover is not yet known, and Karp is going to prove that clique cover is as hard, using the process called reduction. Now, again, emphasizing what we said before: clique cover is A, and chromatic number is B, and we say B reduces to A. Next: in Karp's paper, chromatic number is the parent and clique cover is the child, and the root of the tree is satisfiability. Again, these points one through eight are the same points as before, but instead of saying B and A, I'm naming specific problems: B is chromatic number, A is clique cover. Now, on line 46, detail four: clique cover is the child. And we said above that it's the child that solves the parent's problem, detail six: so here, clique cover solves chromatic number. We do not yet know that clique cover is hard, but we know chromatic number is, and we're able to show that clique cover can help solve chromatic number. The overall terminology, line 49, detail seven: chromatic number reduces to clique cover. And in the other phrasing, if you use the word "from," be careful about who is from whom.
Clique cover reduces from chromatic number. So understand the direction: it's only one direction. Again, the irony is that Karp actually requires two directions for reduction between the two problems, but Karp will only show one direction. So it's important to use the terminology correctly and know which problem is the "from" and which is the "to." And the reason he can get away with one direction, when we just said it takes two, is that the other direction has already been shown, not by him: Cook has already shown it. So keep that in mind: Karp, in this paper, is only showing one direction. Now, we should really have a detail nine, the conclusion: A is as hard as B. In our context: A is clique cover, so clique cover is as hard as chromatic number. But you have to be a little careful here, because the only reason this works, and this is a very important note, is that for Karp, this conclusion rests on the fact that the other direction was already provided by Cook. Cook already showed the other direction. Karp shows one direction because the other was already provided by Cook's analysis. That's why you can state the conclusion. In other words, if all we actually did was the reduction as is, in one direction, Karp would not agree to number nine. If we didn't know Cook's result, if we only knew Karp's reduction, then Karp would not agree to number nine.
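As an aside, the map behind this particular reduction is tiny, and worth seeing once. A proper k-coloring of a graph G partitions its nodes into k independent sets, and an independent set in G is exactly a clique in the complement graph. So, assuming I'm recalling Karp's construction correctly, asking "is G k-colorable?" is the same as asking "can the complement of G be covered by k cliques?". A sketch:

```python
def chromatic_to_clique_cover(edges, nodes, k):
    """Map a CHROMATIC NUMBER instance (is G k-colorable?) to a
    CLIQUE COVER instance: the complement graph, same k.
    Color classes of G are independent sets, i.e. cliques in the
    complement, so the yes/no answers coincide."""
    edge_set = {frozenset(e) for e in edges}
    complement = [(u, v) for i, u in enumerate(nodes)
                  for v in nodes[i + 1:]
                  if frozenset((u, v)) not in edge_set]
    return complement, nodes, k

# A triangle needs 3 colors; its complement has no edges, and
# three single-node "cliques" cover it.
comp_edges, nodes, k = chromatic_to_clique_cover([(1, 2), (2, 3), (1, 3)],
                                                 [1, 2, 3], 3)
print(comp_edges)  # [] -- the complement of a triangle is empty
```

Notice the shape of the argument: an instance of the parent problem is transformed, cheaply, into an instance of the child problem, and a solver for the child then answers the parent's question. That's "the child solves the parent's problem" in code.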
It's only because he knows that the other direction is already there, from Cook's analysis. Cook, on the other hand, would say it alone. Understand that; it's a funny thing. From this reduction and its conclusion, Cook on his own would already say the problem is as hard: Cook's definition of reduction, without needing the second direction, would already conclude that A is as hard as B. But Karp would not yet say it, unless he knew the two problems are on equal footing: that A is as hard as B, and B is as hard as A. So it's a bit of an irony. Karp is only willing to state the conclusion because Cook's analysis provides the other direction. See the note at the end of number nine above. Anyway, enough said. Keep these details in mind as we go through the problems: the parent problem is known to be hard at that point in the tree, the child problem is not yet known to be hard, and Karp wants to show by reduction that the child problem is as hard as the parent's. And the way it works is that the child solves the parent's problem. Why it works this way, I'll do more formally with you another time; it's actually related to why they use less than or equal. But ignore that for now. Let's jump into some of these problems. Again, Karp has 21 problems; historically, one of them was actually shown by Cook, and Karp shows the 20 others to be NP-hard as well. Now, if you notice, I'm using the term NP-complete as opposed to NP-hard: Karp's problems are NP-complete. We mentioned this before, but it's worthwhile to repeat and remind you of the paragraph you have on lines 55 to 60, since it's all a discussion of this point. "Complete" means that you can verify the solution... well, let me say it differently.
Let's say it this way. In essence, "complete" will mean, for Karp, that you can verify the solution in a reasonable amount of time, which in our case means polynomial time. Now, the issue, which we mentioned a week or two ago, is that a lot of Karp's problems are optimization problems. We made the distinction between optimization and decision problems in the following manner. Suppose you have a system of equations with n unknowns, x1 through xn, and you want to solve it. To solve it means you find values for x1 through xn such that every constraint, every equation, is satisfied, in essence comes out true: 5x1 plus 6x2 equals 7, and so on. Now, in the NP tradition, the non-deterministic computer is going to, as it were (bad word, but) guess, and guess correctly every time. It guesses x1 through xn for you. You program it, ask for the values of x1, x2, x3, and so on, and immediately it says: x1 is this, x2 is that, xn is that. So the NP machine gives you those values. Now you have to verify the solution, so you plug it in. In that case, although you're solving a system of equations, it is in some sense a decision problem: a yes/no. Your verification says it's solved or it's not solved; it boils down to a yes/no, check the box. In the case of optimization, that's not clear.
Because if you have some sort of mathematical model of the stock market, and this NP computer tells you, buy the following stock, it's going to be the best for your portfolio, how are you going to verify that? What are you going to do, plug in every possible stock? You could, but that could take an exponential amount of time; there's an exponential number of possibilities. The point is: how are you going to verify? In an optimization problem you want the best, the max or the min, depending on your application. So how are you ever going to verify what the NP computer told you? It says x1 equals 7, x2 equals 5, dot dot dot, and that this maximizes your f. You plug it in, and f comes out pretty large. But how do you know? Maybe if x1 were 8 and x2 were 6, it would be even higher. If the NP computer gave you the values of the x_i, how would you know that the function you're trying to optimize is actually at its best, that you can't do any better? You really can't know. So that leaves the problem hard, but not complete. Optimization problems generally will not be complete, because you can't verify that the solution is actually the best unless you do an exponential amount of work, which is not reasonable, which is what we're avoiding. That's the whole point: we want to save time. If we knew we had to do an exponential amount of work anyway, we didn't need to buy this NP computer; just plug the possibilities in and check them all on your own, sequentially, deterministically.
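The contrast just drawn, cheap verification for a "does it satisfy?" question versus no cheap verification for "is it the best?", can be made concrete. Here's a minimal sketch of the easy half: checking a proposed assignment against a system of linear equations takes one pass over the constraints, polynomial time. The particular system and candidate values are made up for illustration.

```python
def verify(equations, assignment):
    """Decision-style check: given (coefficients, right-hand side)
    pairs and a proposed assignment, confirm every equation holds.
    One pass over the constraints -- polynomial time."""
    for coeffs, rhs in equations:
        if sum(c * assignment[i] for i, c in enumerate(coeffs)) != rhs:
            return False
    return True

# 5*x1 + 6*x2 = 17  and  x1 - x2 = -1, candidate (x1, x2) = (1, 2)
system = [([5, 6], 17), ([1, -1], -1)]
print(verify(system, [1, 2]))  # 5+12=17 and 1-2=-1 both hold -> True
```

There is no analogous one-pass check for "no other assignment does better"; confirming optimality with this kind of tool would mean comparing against every alternative, which is the exponential work the lecture is warning about.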
Brute force, right? Exhaustive search, as it's called. Okay. So Karp wanted all the problems he's dealing with to be verifiable in a polynomial amount of time; he wanted NP-complete problems, not just hard ones. But he had a number of optimization problems under consideration. So, in order to do that, he had to add an extra parameter: he augmented the underlying problem so that the optimization problem becomes a decision problem. Augment means to increase, to make larger; he gave the problem slightly more parameters. Let me word it carefully: considering the underlying NP-hard optimization problem, can a solution be found for a related decision problem, a problem transformed by adding an extra threshold parameter, asking whether a solution can be found whose value is less than, or greater than, the supplied parameter? This, then, is NP-complete.
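Karp's threshold move can be sketched in a few lines. The "optimization problem" below is a deliberately trivial stand-in (pick items to maximize total profit), and all the names and numbers are hypothetical; the point is only the shape of the transformation: instead of "what is the maximum?", ask "is there a selection worth at least k?", which a certificate can verify with one addition and one comparison.

```python
def decision_version(profits, k):
    """Threshold (decision) form of a toy maximization problem:
    'is there a selection with total profit >= k?'. Here the
    optimum is trivial to compute (take every positive item),
    which keeps the example small."""
    best = sum(p for p in profits if p > 0)  # the optimum itself
    return best >= k                         # threshold comparison

def verify_certificate(profits, chosen, k):
    """Verifying the decision version is the easy part: sum the
    chosen items and compare against the threshold k."""
    return sum(profits[i] for i in chosen) >= k

profits = [4, -2, 7, 1]                      # hypothetical values
print(decision_version(profits, 10))         # 4+7+1 = 12 >= 10 -> True
print(verify_certificate(profits, [0, 2], 10))  # 4+7 = 11 >= 10 -> True
```

Note that `verify_certificate` never needs to know the optimum; that is exactly what makes the thresholded problem verifiable, hence a candidate for NP-completeness rather than bare hardness.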
Considering the underlying NP-hard optimization problem, can a solution be found for a related decision problem? We have an optimization problem, so we make a related decision problem, transformed by adding an extra threshold parameter, and ask whether a solution can be found whose value is less than, or greater than, this supplied parameter. All right. That's what he does in all his problems: if he had a strictly optimization problem, he had to add a parameter so that you could verify it. Because with the raw optimization problem, you check and see the solution is a big number; the result looks pretty good, but you can't confirm it's the best. What we can verify is whether it clears a threshold. So we make a threshold parameter: we tell the NP computer, I want a solution such that my profit, my output, is greater than or equal to a certain value. That we can check. All right, so that's how he deals with those problems. Let me give an example: job scheduling, sequencing, number 19. This is an interesting problem; we'll talk more about it when we go through the problems, but it's an example of where he transforms the optimization problem into a decision problem. It's called job scheduling or sequencing. I believe in one place of the paper it's called scheduling, and in another, sequencing; regardless of whether Karp uses both terms, the literature certainly does. Whatever it's called, job scheduling or job sequencing, number 19 asks for a permutation. Let me check the paper for a moment; I seem to recall... yes. Okay.
Good, so it is true. One second, I'll tell you where the two terms are used... Oh, interesting. Okay, I apologize to Karp: he's consistent; everyone else changed it. Let me find the original cousin problem... No, see, that's interesting. I apologize to Karp. I know why I did it, and I'll explain why I wrote it this way. Karp consistently calls it a sequencing problem; I'm the one who used the word scheduling. The reason is that, apparently, in the early 1970s the terminology used was sequencing, because, for the reasons you'll see, it has to do with a permutation. And when we discuss the problem in more detail, practically, it does deal with a permutation, and a permutation is a sequence, a certain order. The fact is that the applications, and how people use it, are for scheduling, so it's more modernly called the scheduling problem. But Karp actually called it the job sequencing problem, and he is consistent throughout the paper; it's just nowadays called scheduling. All right, anyway. So what is the problem? The problem is the following. Imagine you're a contractor. You're going to build a home, for example, or a building. And you've identified all the tasks you need: some tasks involve the windows, some involve the doors, the floors; depending on whether you start from scratch, the frame; the plumbing, the electricity. Lots and lots of tasks, and lots of details. You write up a very detailed listing of all the tasks you have to accomplish, and for each you give the amount it's going to cost, or at least the amount you're going to charge someone. So you have all these tasks and their costs, and the job is the sum...
The contractor gets paid the sum of all those costs, assuming the contractor completes all the tasks. As is, that's pretty straightforward. The issue is that there's a reality check. For example, you can't start putting up the windows until you have floors. I don't know if that's literally true in contracting, but safety laws may require that certain tasks be done before other tasks. There are practical considerations where one task has to be done before another, and the availability of the environment or of resources will require you to do one task before another. So you have all these internal constraints, all these internal deadlines: this must be done by this date, this has to be done first, this second, this is how long it takes, et cetera. So now the question is: can you do all the tasks by the ultimate, total deadline? You have all these tasks, but you want to order them in sequence so that everything moves smoothly, and nobody is waiting on another task, because you've already completed all the prerequisite tasks needed in order to do the current task. All right? Now, about the way Karp worded the problem: I worded the problem the way the modern version is worded. It's the same problem; the difference is the following. I worded it so that you have all these tasks, and the contractor enumerates how much each task individually costs, with the understanding that you don't pick and choose: the contractor has to do all the tasks, and that adds up to the sum for the whole project.
Karp technically wrote it in a penalty sense, not a profit sense. The contractor makes an agreement with the buyer, the homeowner, on the cost of the project, and then lists all the tasks. The homeowner and the contractor then agree that if the contractor does not make a deadline, thereby potentially causing a delay, the contractor is penalized. The homeowner says: I will subtract something from the total price, the total amount I owe you, because you said you would finish the task by a certain day and you didn't. The contractor pays a late fee, deducted from the price of the project. That's how Karp states it. I stated it as: when the contractor finishes a task by the deadline, the contractor is paid for that task. Karp words it as: the contractor quotes the total price, and then if a task is not finished by a certain agreed-upon internal deadline, a certain penalty is deducted from the total amount owed. Either way, what you're trying to do is, in some sense, an optimization problem. What do we mean? The contractor needs to figure out a sequence of all these tasks such that, first of all, for any task, all the prerequisite tasks come before it — you will have done them already — and, to make sure this lineup doesn't cause delays, each task meets its internal, agreed-upon deadline. You're going to finish the floor by a certain day...
You're going to install the windows by a certain day, the roof will be on by a certain day, the doors will be installed, and the locks, the electricity, the plumbing — all of these have pre-existing, agreed-upon days. All right, so the contractor has to permute, to find a sequence of, these tasks such that every task has its prerequisites before it (so you can assume they were done already, which enables the current task to be done), and such that every task is completed by its agreed-upon internal deadline. That long-winded problem is the job sequencing problem. Now here's the issue. Suppose I tell you: the following sequence will get you the most profit, the way I originally stated it, or, in Karp's case, the least penalty — this particular order of doing the tasks incurs the least total penalty. How are you going to verify that? Maybe there's a better sequence that yields more profit, or less penalty; maybe there's a better order for the tasks that is even more efficient. The only way to verify that directly is to try every possible sequence. But we mentioned last class that n factorial is greater than or equal to 2 to the n, so factorial is worse than exponential. So you can't simply verify this. That's why Karp did this by adding an extra parameter. By adding this extra parameter K — one second, let me be consistent with the wording we wrote above — the underlying optimization problem is transformed into a related decision problem. So then Karp could say it's NP-complete, not just hard. The reduction showed it was hard, but Karp also needed the verification to be easy — reasonable — which for us means a polynomial amount of time. So we want to be able to verify it.
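The asymmetry just described can be sketched in code: checking one proposed sequence against prerequisites and deadlines takes polynomial time, even though trying all n! sequences does not. The contractor tasks, durations, and deadlines below are illustrative inventions, not data from Karp's paper.

```python
def feasible(sequence, duration, prereqs, deadline):
    """Check one ordering of tasks: every prerequisite appears earlier
    in the sequence, and each task finishes by its internal deadline.
    One check is polynomial in the number of tasks."""
    position = {task: i for i, task in enumerate(sequence)}
    elapsed = 0
    for task in sequence:
        if any(position[p] > position[task] for p in prereqs.get(task, [])):
            return False          # a prerequisite comes too late
        elapsed += duration[task]
        if elapsed > deadline[task]:
            return False          # task misses its internal deadline
    return True

# Illustrative contractor data: frame before floor, floor before windows.
duration = {"frame": 3, "floor": 2, "windows": 2}
prereqs  = {"floor": ["frame"], "windows": ["floor"]}
deadline = {"frame": 3, "floor": 5, "windows": 8}

print(feasible(["frame", "floor", "windows"], duration, prereqs, deadline))  # True
print(feasible(["floor", "frame", "windows"], duration, prereqs, deadline))  # False
```

The second call fails because the floor's prerequisite (the frame) appears after it, which is exactly the kind of constraint violation the verifier catches in one linear pass.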
So Karp adds this parameter and asks whether the penalty is less than or equal to K. Or, in the modern way of saying it, whether the profit is greater than or equal to K — in some sense it's profit, because from the contractor's point of view the penalties come out of the contractor's pay, so the profit is at least K. But again, that's not the way Karp worded it; Karp worded it as the penalty being at most K. All right, anyway, so that's how Karp, by adding this one parameter — here, I'm just bolding it: K is a user-supplied parameter for lateness cost tolerance — transforms an optimization problem into a related decision problem. All right, so that's the backdrop of Karp's problems. In a few minutes we're going to go into some of Karp's problems. Now, I organized Karp's problems differently than Karp did, obviously. We'll do the first group today, and in a different lecture we'll continue with the others. I have special tabs for some of these problems: the Steiner problem, which is related to the MST problem — MST stands for minimum spanning tree, which Karp quotes on page 89 of his paper. However, I also have to give you an example of what Karp calls matching. The irony of it — the wording is funny; I find it funny, and hopefully you will as well — is in the word matching. I don't think Karp actually introduced the word; it's the name of the problem, because a matching problem had already been introduced, quoted on page 90 of his paper, by someone else. But that problem was a P problem — it was solvable in polynomial time. The funny part is that in the matching problem, you don't want to match. That's the oddity.
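Karp's penalty version with the threshold K can be sketched as follows. This is a minimal sketch of the idea as described in lecture, not Karp's formal statement of problem 19: it only scores one given permutation against threshold `k`, and the function names are invented for illustration.

```python
def total_penalty(sequence, duration, deadline, penalty):
    """Sum the penalties of the tasks that finish after their
    agreed-upon internal deadlines under this ordering."""
    elapsed, cost = 0, 0
    for task in sequence:
        elapsed += duration[task]
        if elapsed > deadline[task]:
            cost += penalty[task]
    return cost

def sequencing_decision(sequence, duration, deadline, penalty, k):
    """Decision version with threshold k: is the total penalty <= k?
    This check runs in polynomial time, which is what makes the
    decision problem verifiable."""
    return total_penalty(sequence, duration, deadline, penalty) <= k
```

The full decision problem asks whether *any* permutation achieves penalty at most K; the point of the threshold is that once a candidate permutation is handed to you, this check settles it in one pass.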
As to why that's so, I have an example here. So: matching — the irony is the fact that you do not want to match; read what that's about. Fine. And then I have here a summary of all the tweaked problems. So here are a number of problems that Karp has, and what I tell you is: that's the NP perspective, and this is the P perspective. In other words, Karp has these NP problems on the left, and then on pages 89 and 90 he also has — he doesn't have all of them; I added one or two, but most of them he has — the P problems that he quotes. He's very purposeful in quoting his problems. He chooses problems that are, in fact, in P. Let me rephrase that. On pages 89 and 90 of his paper, as I pointed out, he has regular, deterministic, polynomial-time, regular for-loop-solvable problems. Good. But those problems are not the typical problems you would probably have chosen if you were telling someone what P is. Right? Reading an array and printing out its contents would probably have been your more typical example of a polynomial-time algorithm. Karp has highly specialized polynomial-time algorithms. So what Karp does is show cousins: this problem, on pages 89 and 90, is in P, but the other problem, later in the paper, Karp says, is NP-complete. That's the listing, and the summary of that listing is here in the tab called Tweaked Problems. Anyway, just to give you a little taste of how — I'm in the second tab, Problem Definitions... Oh, whoa. We have a very important point we have to mention, which is that we have a quiz coming up. Hold on a second.
Let me find... do I have a syllabus in front of me? Give me two seconds. Instead of forcing more new material, I should find you the syllabus, because I believe we have a quiz coming up. Please be a little patient with me; I'm sorry about my technology problem this morning. One second. Okay, let me verify this, because it's quite important. According to the calendar and the syllabus, which you can download — today's the 18th, yes — a week from today is quiz number one. That has not changed. And what I'm going to do on Monday is a review of all our material with you. So this coming Monday, February 23rd, is a review for the quiz, and quiz number one will be a week from today, all on Brightspace. Monday's class will have a virtual classroom; Wednesday will have no virtual classroom — the quiz will be under the quiz tab on Brightspace. If you want more information about that, ask on Monday and I will answer you. One thing about my reviews: I prefer not to come into class and just start talking. So here is your homework — and this is homework that's sort of due, but don't email it to me and don't submit it to me. What's due is that I want you to look over the notes and prepare questions for Monday. Otherwise, we're going to just sit there, you and I, silently, and that's a waste of time. So please look over the notes that I've emailed you — I'll email you today's notes as well, and I'll give one little extra point about today's notes in a second. Those notes are the basis of quiz number one. Please go through them and make a note to yourself of anything you want me to define.
Maybe there's a key term you want me to go over, a formula you want me to plug into and give an example of, or a theorem or concept whose proof you want me to go over — if we discussed the proof. Whatever it is, please come in on Monday with that request. Okay? So that's Monday, and then comes Wednesday. About the notes: there's no issue, I just want to give a heads-up. Even though the notes I have here have many tabs, that would be overwhelming to you, and none of it is on the quiz except what we discussed today about reductions. I will also include, at the bottom — see below — the reason why this notation was used, just for completeness' sake, instead of spending a lot of extra class time on it. So the set of notes will only be for this first tab, and subsequent classes will have the other tabs. Fine. But please bring Karp's paper with you; that's what we're going to be doing after the quiz for a week and a half, maybe two weeks. There's a lot there and a lot to discuss — very interesting, very practical applications. All right. If you didn't have a chance to sign in to class, kindly text the word present. Thank you all, and thank you for your patience. We will continue on Monday — hopefully I'll see you all for the review on Monday. This session closes at 9:05, so it may automatically shut us out in a few seconds. Thank you all. Bye now.