Michael Negnevitsky: Artificial Intelligence. A Guide to Intelligent Systems, Addison Wesley, Second Edition, 2005.
Grading will be based on two exams, four projects and lab activity. Class participation will also be considered. Approximate weights assigned to them will be as follows:
Programming projects must represent your original work and must be developed completely independently by each student. Copying from any source (web, other books, past or current students, etc.) is strictly prohibited. No cooperation, discussion, sharing, exchange or consultation between students is allowed. This, of course, does not apply to any discussions taking place during the lectures under the supervision of the instructor.
You can report cheating anonymously at: https://www.cs.ucr.edu/cheating/.
Students violating this policy will be given a failing grade for the course, and their case will be referred to the office of the Vice Chancellor for Student Affairs. On the other hand, students are strongly encouraged to cooperate and consult each other in preparation for the exams.
A class mailing list, cs171@lists.cs.ucr.edu, will be established to disseminate information pertaining to this class. Make sure to sign up for the mailing list in order to receive prompt information about class assignments, additional resources and other pertinent matters.
You can sign up for it at: https://www.cs.ucr.edu/mailman/listinfo/cs171
IMPORTANT: only UCR e-mail addresses will be allowed on the list!
Assigned: April 4, 2005
Due: April 17, 2005, 23:59 (Sunday), via turn-in
Project 1 text
Lab computers have the freeware SWI Prolog installed on them. You can start the Prolog interpreter by typing "pl" at the command prompt. Alternatively, if you would like to install Prolog on your home machine, you can download the installer from the web site at http://www.swi-prolog.org/.
I will cover enough Prolog to allow you to finish the project; however, if you wish to learn more, a particularly good book is Clocksin and Mellish, Programming in Prolog, 4th ed., New York: Springer-Verlag, 1994. The SWI Prolog manual can also be helpful at times.
Assigned: April 25, 2005
Due: May 2, 2005 (Monday), in the discussion section (hardcopy)
Familiarize yourself with FuzzyCLIPS. Run the fuzzy shower example that comes with the FuzzyCLIPS distribution:
Assigned: April 25, 2005
Due: May 8, 2005, 23:59 (Sunday), via turn-in
The previous homework asked you to spend some time thinking of a particular problem you are familiar with that can be modelled using a fuzzy expert system, and to code several rules as an exercise. Your second project is to expand the draft rules you turned in into a full-fledged working fuzzy system. See the shower example from homework 2 to get an idea of what the project should look like.
The shell we will be using for the fuzzy systems project is FuzzyCLIPS. The Windows version is installed on the school's Windows terminal server ferrari, and can be accessed from the labs by typing the following at the Unix command prompt:
rdesktop -d win -a 16 -f ferrari
The CLIPSWIN.EXE binary will be in C:\fzclips.
Alternatively, you can download a copy for your home machine.
Assigned: May 9, 2005
Part a) due: May 15, 2005, 23:59 (Sunday), via turn-in
Part b) due: May 22, 2005, 23:59 (Sunday), via turn-in
Project 3 asks you to code a back-propagation algorithm for training Neural Networks (NNs), and to conduct some experiments to get a better understanding of how they work.
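As a refresher on the core of the algorithm, here is a minimal sketch of one back-propagation step for a network with a single sigmoid hidden layer and one sigmoid output unit. The function and variable names are illustrative only and do not match the starter-code NeuralNet interface; a training epoch is simply one such step applied to every example in the training set.

    #include <cmath>
    #include <vector>

    double sigmoid(double z) { return 1.0 / (1.0 + std::exp(-z)); }

    // One back-propagation step for a single training example.
    // x: input features; t: target (0 or 1); eta: learning rate.
    // w_ih[j] holds the weights from the inputs to hidden unit j, with the
    // bias stored as the last entry; w_ho holds the weights from the hidden
    // units to the single output unit, again with the bias last.
    void backprop_step(const std::vector<double>& x, double t, double eta,
                       std::vector<std::vector<double>>& w_ih,
                       std::vector<double>& w_ho)
    {
        const std::size_t nh = w_ih.size();

        // Forward pass through the hidden layer and the output unit.
        std::vector<double> h(nh);
        for (std::size_t j = 0; j < nh; ++j) {
            double z = w_ih[j].back();                         // bias term
            for (std::size_t i = 0; i < x.size(); ++i) z += w_ih[j][i] * x[i];
            h[j] = sigmoid(z);
        }
        double zo = w_ho.back();                               // bias term
        for (std::size_t j = 0; j < nh; ++j) zo += w_ho[j] * h[j];
        const double o = sigmoid(zo);

        // Backward pass: error terms for the sigmoid units.
        const double delta_o = (t - o) * o * (1.0 - o);
        std::vector<double> delta_h(nh);
        for (std::size_t j = 0; j < nh; ++j)
            delta_h[j] = h[j] * (1.0 - h[j]) * delta_o * w_ho[j];

        // Gradient-descent weight updates.
        for (std::size_t j = 0; j < nh; ++j) w_ho[j] += eta * delta_o * h[j];
        w_ho.back() += eta * delta_o;
        for (std::size_t j = 0; j < nh; ++j) {
            for (std::size_t i = 0; i < x.size(); ++i)
                w_ih[j][i] += eta * delta_h[j] * x[i];
            w_ih[j].back() += eta * delta_h[j];
        }
    }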
On the web site you will find the starter code for the Neural Networks project. Essentially, you are given a fully written C++ Dataset class and the header file for the NeuralNet class, with standard constructors and train, predict and test methods. What you have to write is the implementation of the functions described in the header file, plus the z-score normalization function.
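To make the z-score step concrete: it rescales each feature column to zero mean and unit variance, x' = (x - mean) / stddev. The sketch below assumes a plain vector-of-vectors layout rather than the actual Dataset interface, so adapt it to the starter code.

    #include <cmath>
    #include <vector>

    // Rescale every feature column to zero mean and unit variance.
    void zscore_normalize(std::vector<std::vector<double>>& examples)
    {
        if (examples.empty()) return;
        const std::size_t nfeat = examples[0].size();

        for (std::size_t f = 0; f < nfeat; ++f) {
            // Mean of feature f.
            double mean = 0.0;
            for (const auto& ex : examples) mean += ex[f];
            mean /= examples.size();

            // Standard deviation of feature f.
            double var = 0.0;
            for (const auto& ex : examples) var += (ex[f] - mean) * (ex[f] - mean);
            const double sd = std::sqrt(var / examples.size());

            // Rescale; constant features (sd == 0) are simply set to zero.
            for (auto& ex : examples)
                ex[f] = (sd > 0.0) ? (ex[f] - mean) / sd : 0.0;
        }
    }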
Part A was to code the algorithm. Part B is to see how it works. The dataset you will test your algorithm on is optical recognition of handwritten digits. It has been split into two parts: the training set, which you should use to train your NN, and the test set, which you should use to evaluate how your learning algorithm performs. All the examples in both datasets are in the same format: 64 comma-separated features plus the target class (0 or 1). For these particular examples, label 0 really means digit 0 and label 1 really means digit 1. (In other words, you should expect perfect accuracy since the two are quite distinct.) In general, if you were to distinguish between 3 and 8, you would label the 3-examples as 1 and the 8-examples as 0 (or -1), or the other way round. There are benefits to labeling positives and negatives in this manner (can you think of why this is a good idea with NNs?).
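If you need to read the files yourself, a minimal sketch of parsing one example in this format is shown below. The file name is hypothetical, and the provided Dataset class may already handle this for you.

    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    int main()
    {
        std::ifstream in("optdigits.train");    // hypothetical file name
        std::string line;
        while (std::getline(in, line)) {
            std::stringstream ss(line);
            std::string field;
            std::vector<double> features;
            while (std::getline(ss, field, ','))
                features.push_back(std::stod(field));

            const int label = static_cast<int>(features.back()); // last value = class
            features.pop_back();                                  // keep the 64 features
            std::cout << features.size() << " features, label " << label << "\n";
        }
        return 0;
    }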
So, in part B, you have to perform experiments on your dataset and see how the learner behaves. In general, you should have as many input nodes in the input layer as you have features (you could also do feature selection, but let's not get into that here), and as many output nodes as you have classes. Since this is a binary classification problem, you could (and should) get away with just one output node. For this experiment, one hidden layer should suffice. Since you do not know the right number of neurons in the hidden layer, you will have to test different possibilities and see what works best.
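With a single sigmoid output node, one common convention (an assumption here, not something dictated by the starter code) is to threshold the output at 0.5 when predicting:

    // Map the single sigmoid output in (0, 1) to a class label.
    int predict_class(double output)
    {
        // Close to 1 -> positive class (label 1), close to 0 -> negative class (label 0).
        return output >= 0.5 ? 1 : 0;
    }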
What the experiment asks you to do is to vary the number of neurons in the hidden layer (try 1, 2, 4, 8, 10) and the number of epochs the NN trains for (try 100, 200, 400, 800, 1000). Report performance (sensitivity and specificity) on the test set for all 25 combinations of the experimental conditions. Also, to get better insight into how the NN is learning, observe and report on the 25 time series of MSE (mean-squared error) from one iteration of training to the next.
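As a reminder of the quantities to report: sensitivity is the fraction of positive examples classified as positive, specificity is the fraction of negative examples classified as negative, and MSE is the squared error averaged over the examples seen in one pass. A minimal sketch with hypothetical names follows; you would compute these inside a double loop over the five hidden-layer sizes and the five epoch counts listed above.

    #include <vector>

    struct Evaluation { double sensitivity, specificity; };

    // Compare predicted labels against actual labels on the test set.
    // Assumes both classes occur at least once in the actual labels.
    Evaluation evaluate(const std::vector<int>& predicted,
                        const std::vector<int>& actual)
    {
        int tp = 0, tn = 0, fp = 0, fn = 0;
        for (std::size_t i = 0; i < actual.size(); ++i) {
            if (actual[i] == 1 && predicted[i] == 1) ++tp;
            else if (actual[i] == 1)                 ++fn;
            else if (predicted[i] == 0)              ++tn;
            else                                     ++fp;
        }
        return { tp / double(tp + fn), tn / double(tn + fp) };
    }

    // Mean-squared error for one epoch, given network outputs and targets.
    double mse(const std::vector<double>& outputs, const std::vector<double>& targets)
    {
        double sum = 0.0;
        for (std::size_t i = 0; i < outputs.size(); ++i) {
            const double d = targets[i] - outputs[i];
            sum += d * d;
        }
        return sum / outputs.size();
    }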
Finally, write a five-sentence paragraph summarizing your findings (which combination works best and why, which one was completely off and why, and so on). Submit a *PDF* copy of your report through the turn-in system.