CS 4500 Fall 2009
Out: September 18, 2009
Due: September 25, 2009
Karl Lieberherr

***** Subproject 2 *****

**** Introduction ****

This subproject will already result in our first agent competition (the simpler game: Classic SCG CSP T Ball), thanks to the functioning agent "babies" that the teaching staff has prepared for you. Section 5.3 in the requirements document explains the different versions of the Classic SCG CSP game.

The agents can communicate in a mutually understandable way, and they can look through the store to find out what is for sale. What they cannot do well yet is recognize good challenges to buy and good types of challenges to offer in the market at a price that will produce a profit.

In class we did an experiment: to perform a requirements analysis of the "make money with challenge C" requirement, we played with a suitably bent coin. We determined that the average satisfaction ratio among all assignments generated with the suitably bent coin is a certain number in [0,1], called break-even(C). We write fsat(I, J) for the fraction of constraints of problem instance I satisfied by assignment J. Our agent can exploit this result in three ways:

1) If an agent sees a challenge C on sale at a price p < break-even(C), the agent should buy it in the hope that it can quickly find, for any given problem instance I for C, an assignment J with fsat(I, J) >= break-even(C). The agent knows such an assignment must exist and will look for it by flipping a suitably bent coin.

2) The second way to use the result is to never offer a challenge C below the price break-even(C), because we know that for all problem instances I of challenge C there is an assignment J such that fsat(I, J) >= break-even(C).

3) The third way to use the result is to solve a given problem instance using the optimally biased coin.

All Boolean relations used in the game have arity 3 (three arguments); there are 256 such relations, numbered 0..255.
Important hint: Use Ahmed's Relation class, included in the scglibs.jar distributed with the player. The documentation can be found here, and the source is available in the course source directory. Important methods:

reduce(int, int) : used to compute the quality of a solution.
q(int) : used to compute the polynomials that need to be maximized.

The Relation class will greatly simplify your work. An important detail: the relation 22(a, b, c) is encoded as:

c b a
---------
0 0 0   0
0 0 1   1
0 1 0   1
0 1 1   0
1 0 0   1
1 0 1   0
1 1 0   0
1 1 1   0

Notice the reverse order of the variables: the row number, read in binary, lists c, b, a rather than a, b, c. (22 in binary is 00010110, so exactly the rows numbered 1, 2, and 4 satisfy the relation, i.e., the rows in which exactly one variable is true.)

**** Part 1: 15 points ****
===============================================================================

Compute an upper bound on the price of a challenge C in Classic SCG CSP T Ball at which you would accept the challenge. You would accept C if the price is below this bound, because you know there exists a solution with which you will make a profit. More formally, we define break-even(C) for a challenge C to be the largest real number in [0,1] for which we can prove that for all problem instances I of challenge C there exists a solution J with fsat(I, J) >= break-even(C).

Solution: Map C to a polynomial f[C] in b, where b is the bias of a bent coin used to assign variables in the solution: f[C](b) is the expected fraction of satisfied constraints when assignments are generated randomly using a bent coin with bias b. Then break-even(C) = f[C](b_max), where b_max is the value of b at which f[C](b) is maximum in [0,1].

**** Part 2: Accept Task: 3 points ****
===============================================================================

Add intelligence to the accepting task of your baby agent so that it accepts every challenge C with a price p < break-even(C), provided you have enough funds available.
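The Part 1 solution can be sketched in code. The following is a minimal stand-alone sketch for a challenge over a single arity-3 relation; it does not use Ahmed's Relation class, and the class and method names here are illustrative only. It builds f[C](b) by summing, over the satisfying rows of relation r, the probability that a coin with bias b produces that row, and then maximizes f over a grid of biases:

```java
// Hypothetical sketch (not the Relation class API): compute the
// break-even polynomial f[C](b) for a relation number r in 0..255
// and maximize it over a grid of biases b in [0,1].
public class BreakEven {

    // f(b) = sum over the satisfying rows i of r of
    //        b^(ones in i) * (1-b)^(3 - ones in i),
    // the expected fraction of satisfied constraints under bias b.
    static double f(int relation, double b) {
        double sum = 0.0;
        for (int row = 0; row < 8; row++) {
            if ((relation & (1 << row)) != 0) {
                int ones = Integer.bitCount(row);
                sum += Math.pow(b, ones) * Math.pow(1 - b, 3 - ones);
            }
        }
        return sum;
    }

    // break-even(C) ~= max of f over a grid of biases; a finer grid
    // (or solving f'(b) = 0 exactly) tightens the estimate.
    static double breakEven(int relation) {
        double best = 0.0;
        for (int step = 0; step <= 1000; step++) {
            best = Math.max(best, f(relation, step / 1000.0));
        }
        return best;
    }

    public static void main(String[] args) {
        // Relation 22 (binary 00010110) holds exactly when one variable
        // is true, so f(b) = 3*b*(1-b)^2, maximized at b = 1/3.
        System.out.printf("break-even(22) = %.4f%n", breakEven(22));
    }
}
```

For relation 22 this yields f(1/3) = 4/9, so an agent following Part 2 would accept relation-22 challenges priced below 4/9.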
**** Part 3: Offer Task: 3 points ****
===============================================================================

Add intelligence to the offering task of your baby agent so that it only offers challenges C at a price p > break-even(C). The hope is that once such a challenge is accepted, your agent will produce a problem instance for which less than break-even(C) of the constraints can be satisfied. We will focus on this task (providing) in later subprojects, but some of you might already have an idea of what should be done.

**** Part 4: Solve Task: 4 points ****
===============================================================================

Add intelligence to the solving task of your agent so that it uses an optimally biased coin to generate the solution assignments. Generate a few random assignments and take the best one.

**** Part 5: Provide Task: 4 points ****
===============================================================================

The providing task needs to create more complex problem instances. If you don't have any better ideas, you can use this problem instance (i.e., translate it into your providing task):

R(v1 v2 v3)
R(v1 v2 v4)
R(v1 v2 v5)
R(v1 v3 v4)
R(v1 v3 v5)
R(v1 v4 v5)
R(v2 v3 v4)
R(v2 v3 v5)
R(v2 v4 v5)

Our expectation is that if you add this knowledge to your agent, it will perform better than the original baby agent. We will include a plain baby agent in the competition for comparison.

***** Testing *****
===============================================================================

It is important to unit test all aspects of your player. We expect you to put some "Brain Power" into writing test cases, test files, and extra test classes. Please demonstrate this to us by writing a reasonable number of test cases and documenting your software accordingly. This will count toward your grade, and it also builds good habits for the future.
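As one concrete starting point for such unit tests, here is a minimal, self-contained sketch of the Part 4 solve task; the names are illustrative and are not the player framework's API. It evaluates fsat over an instance given as triples of variable indices, draws several assignments with a biased coin, and keeps the best one:

```java
import java.util.Random;

// Hypothetical sketch of the solve task: generate several random
// assignments with an optimally biased coin and keep the one with
// the highest fsat. Names are illustrative, not the framework's API.
public class BiasedSolver {

    // True iff relation r (0..255) holds for (a, b, c); the row index
    // is read in reverse variable order, c b a, as in the hint above.
    static boolean holds(int r, boolean a, boolean b, boolean c) {
        int row = (c ? 4 : 0) | (b ? 2 : 0) | (a ? 1 : 0);
        return (r & (1 << row)) != 0;
    }

    // Fraction of the constraints of 'instance' (triples of variable
    // indices) satisfied by 'assignment' under relation r.
    static double fsat(int r, int[][] instance, boolean[] assignment) {
        int satisfied = 0;
        for (int[] clause : instance) {
            if (holds(r, assignment[clause[0]], assignment[clause[1]],
                         assignment[clause[2]])) {
                satisfied++;
            }
        }
        return (double) satisfied / instance.length;
    }

    // Best of 'tries' random assignments; each variable is set true
    // with probability 'bias' (the b_max computed in Part 1).
    static boolean[] solve(int r, int[][] instance, int numVars,
                           double bias, int tries, Random rng) {
        boolean[] best = null;
        double bestScore = -1.0;
        for (int t = 0; t < tries; t++) {
            boolean[] candidate = new boolean[numVars];
            for (int v = 0; v < numVars; v++) {
                candidate[v] = rng.nextDouble() < bias;
            }
            double score = fsat(r, instance, candidate);
            if (score > bestScore) {
                bestScore = score;
                best = candidate;
            }
        }
        return best;
    }
}
```

A natural unit test is to feed it the nine-constraint Part 5 instance with relation 22 and bias 1/3 and check that enough tries reach at least the break-even fraction 4/9.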
***** Submission *****
===============================================================================

You will turn in your player/agent to Blackboard as a JAR file containing only your source code. You will also place a runnable JAR of your player (without source) in /scratch/lieber/cs4500/f09/sep25/.

The makeSubmission.sh script in the player distribution will create the needed archives for submission. Supposing your team name is "BryansTeam", run the script within your player's directory as:

./makeSubmission.sh "BryansTeam"

The source JAR, BryansTeam_source.jar, will be created and should be uploaded to Blackboard. The runnable JAR, BryansTeam.jar, should be copied to /scratch/lieber/cs4500/f09/sep25/BryansTeam.jar to help other teams in testing later. Remember that your source code will be made available to the class at a later time (i.e., write good comments and be professional).

The submitted (runnable) agents will be made available to the entire class so that you can run local competitions, check the results, observe the behavior of all agents, and learn from them. Competitions will be held on the due date at midnight, started by an automated script. Players will be able to register (start running) around 6pm on the due date, which means the administrator will be up, accepting/registering participating players for the competition (more on that later). The competitions are expected to run in less than an hour, so the results should be available by the time you wake up the following day.

*Important*: be sure that the computer on which you run your agent is reachable from the Internet! We suggest that you use one of the CCIS Linux machines; we will go over competition setup in the second lab.