
Presentation Transcript


  1. SVD: How/when do we use pTrees to speed up training? Follow gradients to minimize the SVD prediction error (mse over the TrainSet). Classification/Prediction (the heart of Data Mining?). A taxonomy of classification situations might start with the number of entities in the Training Set:
1-Entity TrainingSet (e.g., IRISes, Concrete Mixes, Wines, Seeds, ...): use FAUST or CkNN or ???? (But always use pTree Technology!!!)
2-Entity TrainingSet (e.g., Netflix Movie Recommender (users, movies), MBR (users, items), Text Mining (docs, terms))
3-Entity TrainingSet (e.g., Document Recommenders (users, docs, terms))
Recommender Taxonomy. Two-Entity Recommenders (2Es) (e.g., Users and Items). [Figure: the rolodex diagram of the DT (documents x terms), DU (documents x users) and UT (users x terms) bit matrices omitted.]
Netflix = a two-entity (users, movies), 5-star rating (1-5) relationship (matrix) in which 85% of the potential TrainSet ratings are blank.
User-ID-Yes (UY): 2-entity (users, items) s.t. users give their identity at check-out (e.g., Sam's, Amazon? ...)
User-ID-No (UN): 2-entity (users, items) s.t. users do not give their identity at check-out (e.g., Sunmart ...)
Quantity-Yes (QY): matrix cells contain the quantity of that item bought by that user.
Quantity-No (QN): matrix cells do not contain the quantity of the item bought by that user.
Ratings-Yes (RY): matrix cells always contain a qualitative rating of that item by that user.
Ratings-No (RN): matrix cells never contain a qualitative rating of that item by that user.
Ratings-Maybe (RM): matrix cells may contain a rating of that item by that user.
Three-Entity Recommenders (3Es) (e.g., in the Document Recommender situation: Users, Documents, Terms). We have a Document-Term matrix (cells contain tfIDFs?). We have a User-Document matrix (a 2E where Doc = Item). We may have a User-Term matrix (cells contain a user's level of term liking, i.e., the user's ratings of terms) (RM?). I think it is better to model this as a 3-hop rolodex model (DT, DU, UT) (actually cyclic!) than as a 3-D matrix (DataCube model).
Netflix is a 2E UY, QN, RM. pTreeSVD works best for RY and RN? (0% blanks) (in RN, "buy" means rating=1 and "don't buy" means rating=0). Maybe pTrees are not best for Netflix? (prove me wrong ;-) Let uc=user_count, ic=item_count, fc=feature_count, bp=blanks_percentage. In Netflix, uc=500K, ic=17K, fc=40, bp=85%. Therefore horizontal SVD converts a problem of using the 100M non-blank TrainSet ratings down to working with two feature matrices of sizes 500Kx40=20M and 17Kx40=700K (a total of less than 21M). However, when we use pTrees in the RM setting, we do not get to ignore the blanks! Maybe pTrees will shine anyway? However, for a Document Recommender there are no blanks, so pTrees should shine!
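For concreteness, the size arithmetic from the paragraph above (uc, ic, fc are the slide's numbers; the code is just the multiplication):

  #include <cstdio>

  // Size arithmetic for horizontal SVD on Netflix: the 100M non-blank
  // training ratings are replaced by two small feature matrices.
  int main() {
      const long long uc = 500000, ic = 17000, fc = 40;
      long long userParams  = uc * fc;   // 20,000,000
      long long movieParams = ic * fc;   //    680,000 (~700K)
      std::printf("user features:  %lld\n", userParams);
      std::printf("movie features: %lld\n", movieParams);
      std::printf("total: %lld (vs. 100M ratings)\n", userParams + movieParams);
  }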

  2. SVD classification example. Training Data = 26 users (a thru z) x 8 movies (1 thru 8); Learning Rate = L = 0.1; Bias = B = 0.1; F = 3 features (the initial feature values for all 34 dimensions are shown as the first 3 rows, round0). [Tables: the rating matrix and the per-round feature values and mse omitted. Run1 (L=0.1, B=0.1): the mse falls from 11.53 to ~2.08 and then creeps upward while oscillating (2.078, 2.072, 2.117, 2.188, 2.172, 2.303, 2.233, 2.411, 2.294). Run2 (L=0.1, B=0): 11.53, 2.12, 2.35 and worsening. Run3 (L=0.001, B=0): 11.53, 1.804, 1.804 and it stays.] The min mse is ~1.8?, and L=0.001 seems to get us there and keep us there. Next, 2 features instead of 3? What is suggested: LRATE and BIAS can be adjusted to improve the convergence rate. LRATE substitutes for a line search, but I still don't know what BIAS does in the training update formulas. It seems to be a matter of using L*Error/xi + B*Error/xj instead of just L*Error/xi. Note that the three feature vectors remain identical throughout the training. This vector ends up as the primary eigenvector? And the corresponding eigenvalue is the singular value?
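A minimal sketch of one per-rating update under a Funk-style reading of the rule above. The B term mirrors the author's guess "L*Error/xi + B*Error/xj" and is hypothetical, since the slides leave BIAS's role open:

  #include <cstddef>
  #include <vector>

  // One training update for feature k: each side is nudged by
  // L * error * (the other side's value). The B term is a hypothetical
  // reading of the slide's "L*Error/xi + B*Error/xj".
  void trainOne(std::vector<double>& uF, std::vector<double>& mF,
                int k, double rating, double L, double B) {
      double pred = 0;
      for (std::size_t f = 0; f < uF.size(); ++f) pred += uF[f] * mF[f];
      double err = rating - pred;
      double u = uF[k], m = mF[k];
      uF[k] += L * err * m + B * err * u;
      mF[k] += L * err * u + B * err * m;
  }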

  3. Perfectly consistent ratings data; a little more consistent ratings data; perfectly consistent ratings data? (more to learn?). [Tables: the three 26-user x 8-movie rating matrices and the per-round feature values omitted.]
Run1 with L=0.1, B=0.1 and F=2, round0 (I took out user=t): mse goes 11.53, 2.078, 2.070, 2.105: mse is starting to bounce. Using 2 features is as good as 3 (not surprising, since they remain identical).
More runs: Run1 with L=0.001, B=0 and F=10 (took out user=t): mse 9.804, then 1.395, and it stays there. Run2 with L=0.001, B=0 and F=2 (lowering LRATE): 11.53, then 1.804, and it stays. Run1 with L=0.001, B=0 and F=10 (using more features): 11.53, 1.809, 1.809. Run1 with L=0.001, B=0 and F=2 (took out user=t): 9.804, then 1.390. Run1 with L=0.1, B=0.1 and F=2 on the consistent data: 0.121, 0.070, 0.016. Run1 with L=0.001, B=0.001 and F=10: 2.048, 3.073, 2.076 (bouncing). Run1 with L=0.1, B=0.1 and F=2: 8.902, 0.803, 0.375, 0.758, 0.479 (bouncing).

  4. Using just 1 feature, bias = 0. Ratings: [26-user x 8-movie table omitted]. [Log: per-round mse omitted. With L=0.03 the mse descends from 1.62029855 to a minimum of 1.31254774 and then starts creeping back up (1.31430041 by the time L is dropped); with L=0.01 the descent resumes (1.31408436 down through 1.2889...); with L=0.001 it finishes at mse=1.27684192. Initial errors: mse=1.644444444; final errors: mse=1.27684192. The per-user and per-movie error/feature tables are also omitted.]
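The log pattern above (train until the mse stops improving, then drop L) suggests a simple annealing rule. A sketch under that assumption; trainEpoch is a hypothetical stand-in for one pass of feature updates, and the schedule 0.03 -> 0.01 -> 0.001 is read off the log, not prescribed by the slides:

  #include <cstdio>
  #include <functional>

  // Run epochs at each learning rate until the mse turns upward,
  // then drop to the next rate, mimicking the log above.
  void anneal(const std::function<double(double)>& trainEpoch) {
      const double schedule[] = {0.03, 0.01, 0.001};
      double best = 1e30;
      for (double L : schedule)
          for (;;) {
              double mse = trainEpoch(L);   // one pass; returns this epoch's mse
              std::printf("%.8f mse %g L\n", mse, L);
              if (mse >= best) break;       // mse turned upward: drop L
              best = mse;
          }
  }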

  5. Using 3 features, bias = 0, ratings-1. [Log and tables: the per-round mse at L=0.01 and the per-user/per-movie feature values and errors omitted. The mse falls 3.73213940, 1.59355250, 0.88102960, 0.59204743, 0.45747486, 0.39007264, 0.35515036, ... down to a final mse=0.27013475; the initial xse=11.53. After approximately 1000 rounds the mse is still inching down: 0.17756823 ... 0.17741135.]

  6. Making the f's different initially. [Log and tables: the per-round mse (initial 1.7009393, falling 0.96885944, ..., 0.38172333, ..., 0.26872437) and the feature-value/error tables at L=.01, B=0 after ~300 rounds omitted.] We note that the change in mse is down to about .00000040 per round and can't go much further (with this system). The mse=.26872437 is much higher than the previous mse=.17741135 obtained when starting with f1=f2=f3=(1,0,...,0, 3,...,3). Why? It's p1, v8, v1, x6, j2, a4 and a6 that are causing the drastic increase in error. Why? As engineers, we will of course start with equal f-vectors! Define the fk-vector as the 34-D vector of the users' fk-values (a-z) and the item values (1-8). Both f2-f1 and f3-f1 remain constant through all rounds. This all suggests using f1=f2=f3 initially (the final f1=f2=f3 will be the max eigen direction), all zeros except the component of min variance????
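A short derivation of the identical-features observation (slides 2, 3 and this one), assuming the shared-error update rule used throughout; this is my reading of why the feature vectors cannot separate, not something stated on the slide:

  \[
    err_{u,m} = r_{u,m} - \sum_{k=1}^{F} uF_k\, mF_k, \qquad
    uF_k \mathrel{+}= L\, err_{u,m}\, mF_k, \qquad
    mF_k \mathrel{+}= L\, err_{u,m}\, uF_k .
  \]

The error is computed once per TrainSet cell and shared by every feature k, so if uF_j = uF_k and mF_j = mF_k at round 0, features j and k receive identical updates at every round and remain identical forever. That is why 2 features performed exactly like 3 on slide 3, and why differently initialized features (this slide) can land somewhere else entirely.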

  7. Same, but with just 1 feature: initial mse 1.8048780; after ~1000 rounds, 0.1693101. [Tables: the rating matrices, the uniform 0.33 initial feature values, and the per-user/per-movie values at L=.01, B=0 omitted.]

  8. Using 1 feature. [Tables: the 26-user x 8-movie rating matrix and the per-user/per-movie feature values for runs at L=.01/B=0, L=.02/B=.02 and L=.03/B=.03 omitted.] Initial mse 1.8048780 with 1 feature; after ~1000 rounds the mse reaches ~0.15252 (another run: ~0.15361). The last run converged even more rapidly at first, dropping below .16 in 146 rounds.

  9. 3-hop ct(&eARe  mnsup 2 2 3 3 4 4 5 5 ct(&f&eAReSf)  mnsp 1 1 1 0 1 1 0 0 0 1 0 0 0 0 1 1 ct(&eARe &glist&hCThSg ) ct(&flist&eAReSf &hCTh) / ct(&flist&eAReSf) /ct(&eARe / ct(&f=2,5Sf / ct(1101 & 0011 ) ct(&f=2,5Sf ct(1101 & 0011 & &1101 ) &1101 ) Focus on F Are they different? Yes, because the confidences can be different numbers. Focus on G. / ct(0001) = 1/1 =1 ct(0001 ) ct(&eARe &g&hCThSg) / ct(&eARe  mncnf ct( 1001 &g=1,3,4 Sg ) /ct(1001) ct(PA)  mnsup C H ct( 1001 &1001&1000&1100) / 2 / ct(PA)  mncnf ct(PA & Rf) G S(F,G) Focus on E f&g&hCThSg ct( 1000 ) / 2 = 1/2 0 1 0 1 4 0 0 0 1 3 1 0 1 0 antecedent upward closure: A infreq. implies subsets infreq. A 0-hops from E (up) 2 0 0 0 1 1 consequent downward closure: AC noncnf implies AD noncnf. DC. C 3-hops (down) T(G,H) Focus on H ct(&f&eAReSf &hCTh) F / ct(&f&eAReSf) mnsp mncnf  mncnf ct(& Tg & PC) g&f&eAReSf 0 1 0 0 4 0 0 0 1 3 ct(& Tg) /ct(& Tg) g&f&eAReSf g&f&eAReSf 0 0 1 0 2 antecedent downward closure: A infreq. implies all subsets infreq. A 3-hops from G (down) 0 0 0 1 1 consequent upward closure: AC noncnf impl AD noncnf. DC. C 0-hops (up) A R(E,F) E Collapse T: TC≡ {gG|T(g,h) hC} That's just 2-hop case w TCG replacing C. ( can be replaced by  or any other quantifier. The choice of quantifier should match that intended for C.). Collapse T and S: STC≡{fF |S(f,g) gTC} Then it's 1-hop w STC replacing C. Focus on F antecedent downward closure: A infreq. implies supersets infreq. A 1-hop from F (down consequent upward closure: AC noncnf implies AD noncnf. DC. C 2-hops (up Focus on G antecedent upward closure: A infreq. implies all subsets infreq. A 2-hop from G (up) consequent downward closure: AC noncnf impl AD noncnf. DC. C 1-hops (down)

  10. Simon Funk: Netflix provided a database of 100M ratings (1 to 5) of 17K movies by 500K users, each as a triplet of numbers: (User, Movie, Rating). The challenge: for (User, Movie, ?) not in the database, predict how the given User would rate the given Movie. Think of the data as a big, sparsely filled matrix, with userIDs across the top and movieIDs down the side (or vice versa, then transpose everything), where each cell contains an observed rating (1-5) for that movie (row) by that user (column), or is blank, meaning you don't know. This matrix would have 8.5B entries, but you are only given values for 1/85th of those 8.5B cells (100M of them). The rest are all blank. Netflix posed a "quiz": a bunch of question marks plopped into previously blank slots, and your job is to fill in best-guess ratings in their place. Squared error (se) measures accuracy (you guess 1.5, the actual is 2, you get docked (2-1.5)^2 = .25). They use root mean squared error (rmse), but if we minimize mse, we minimize rmse. There is a date for ratings and question marks (so a cell can potentially have more than one rating in it).
Any movie can be described in terms of some features (or aspects) such as quality, action, comedy, stars (e.g., Pitt), producer, etc. A user's preferences can be described in terms of how they rate the same features (quality/action/comedy/star/producer/etc.). Then ratings ought to be explainable by a lot less than 8.5 billion numbers (e.g., a single number specifying how much action a particular movie has may help explain why a few million action-buffs like that movie).
SVD: Assume 40 features. A movie, m, is described by mF[40] = how much that movie exemplifies each aspect. A user, u, is described by uF[40] = how much he likes each aspect. Then
  P(u,m) = uF o mF = SUM(k=1..40) uF_k * mF_k
  err(u,m) = r(u,m) - P(u,m)
  mse = (1/8.5B) SUM(m=1..17K; u=1..500K) err(u,m)^2
  d(mse)/d(uF_h) = -(2/8.5B) SUM err(u,m) * mF_h;   d(mse)/d(mF_h) = -(2/8.5B) SUM err(u,m) * uF_h
In matrix form, P = UT o M, where P is the 500Kx17K prediction matrix, UT is 500Kx40 (row u is uF), and M is 40x17K (column m is mF).
So, we increment each uF_h += 2*err*mF_h and each mF_h += 2*err*uF_h. This is a big move and may overshoot the minimum, so the 2 is replaced by a smaller learning rate, lrate (e.g., Funk takes lrate=0.001). SVD is a trick which finds UT, M minimizing mse(k) (one k at a time). So the rank-40 SVD of the 8.5B-cell Training matrix is the best (least-error) approximation we can get within the limits of our user-movie-rating model. I.e., the SVD has found the "best" feature generalizations. To get the SVD matrices we take the gradient of mse(k) and follow it. This has a bonus: we can ignore the unknown error on the 8.4B empty slots. Take the gradient of mse(k) (just the given values, not the empties), one k at a time. With horizontal data, the code is evaluated for each rating. So, to train for one sample:
  real *userValue = userFeature[featureBeingTrained];
  real *movieValue = movieFeature[featureBeingTrained];
  real lrate = 0.001;
  userValue[user] += lrate * err * movieValue[movie];
  movieValue[movie] += lrate * err * userValue[user];
More correctly (caching uv so the movie update uses the pre-update user value):
  uv = userValue[user];
  userValue[user] += err * movieValue[movie];
  movieValue[movie] += err * uv;
This finds the most prominent feature remaining (the one that most reduces error). When it's good, shift it onto the done features and start a new one (cache the residuals of the 100M ratings. "What does that mean for us???"). This gradient descent has no local minima, which means it doesn't really matter how it's initialized. With regularization, the update becomes
  u_a += lrate * (err(u,i) * i_a - lambda * u_a), where err(u,i) = r(u,i) - p(u,i) and r(u,i) = the actual rating.

  11. Refinements: Prior to starting SVD, note AvgRating(movie) and AvgOffset(UserRating, MovieAvgRating) for every user. I.e.:
  static inline real predictRating_Baseline(int movie, int user)
  { return averageRating[movie] + averageOffset[user]; }
So that's the return value of predictRating before the first SVD feature even starts training. You'd think the avg rating for a movie would just be... its average rating! Alas, Occam's razor was a little rusty that day. If m appears only once, with r(m,u)=1 say, is AvgRating(m)=1? Probably not! View r(m,u)=1 as a draw from a true probability distribution whose average you want... View that true average itself as a draw from a probability distribution of averages, the histogram of average movie ratings. Assume both distributions are Gaussian; then the best-guess mean should be a linear combination of the observed mean and the a priori mean, with a blending ratio equal to the ratio of variances. If Ra and Va are the mean and variance (squared standard deviation) of all of the movies' average ratings (which defines your prior expectation for a new movie's average rating before you've observed any actual ratings) and Vb is the average variance of individual movie ratings (which tells you how indicative each new observation is of the true mean: e.g., if the average variance is low, ratings tend to be near the movie's true mean, whereas if the avg variance is high, ratings tend to be more random and less indicative), then:
  BogusMean = sum(ObservedRatings) / count(ObservedRatings)
  K = Vb / Va
  BetterMean = [GlobalAverage*K + sum(ObservedRatings)] / [K + count(ObservedRatings)]
The point here is simply that any time you're averaging a small number of examples, the true average is most likely nearer the a priori average than the sparsely observed average. Note that if the number of observed ratings for a particular movie is zero, the BetterMean (best guess) above defaults to the global average movie rating, as one would expect.
Moving on: 20M free params is a lot for a 100M TrainSet. It seems neat to just ignore all the blanks, but we have expectations about them. As-is, this modified SVD algorithm tends to make a mess of sparsely observed movies or users. Say you have a user who has only rated 1 movie, American Beauty = 2 while the avg is 4.5, and further their offset is only -1: then, prior to SVD, we'd expect them to rate it 3.5. So the error given to the SVD is -1.5 (the true rating is 1.5 less than we expect). m(Action) is training up to measure the amount of Action, say, .01 for American Beauty (just slightly more than avg). SVD optimizes predictions, which it can do by eventually setting our user's preference for Action to a huge -150. I.e., the algorithm naively looks at the only example it has of this user's preferences, and in the context of only the one feature it knows about so far (Action), determines that our user so hates action movies that even the tiniest bit of action in American Beauty makes it suck a lot more than it otherwise might. This is not a problem for users we have lots of observations for, because the random apparent correlations average out and the true trends dominate. We need to account for priors. As with the average movie ratings, we should blend our sparse observations in with some sort of prior, but it's a little less clear how to do that with this incremental algorithm. But if you look at where the incremental algorithm theoretically converges, you get:
  userValue[user] = [sum residual[user,movie]*movieValue[movie]] / [sum (movieValue[movie]^2)]
The numerator there will fall in a roughly zero-mean Gaussian distribution when charted over all users, which, through various gyrations, leads to:
  userValue[user] = [sum residual[user,movie]*movieValue[movie]] / [sum (movieValue[movie]^2 + K)]
And finally back to:
  userValue[user] += lrate * (err * movieValue[movie] - K * userValue[user]);
  movieValue[movie] += lrate * (err * userValue[user] - K * movieValue[movie]);
This is equivalent to penalizing the magnitude of the features, which cuts overfitting, allowing the use of more features.
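A sketch of the BetterMean blending above (the formula is the slide's; the function name and signature are mine):

  #include <numeric>
  #include <vector>

  // Blended ("better") mean: K = Vb/Va, where Va is the variance of the
  // movies' average ratings and Vb the average variance of individual
  // movie ratings. With zero observations it degrades to the global
  // average, as the slide notes.
  double betterMean(const std::vector<double>& observed,
                    double globalAverage, double Va, double Vb) {
      double K = Vb / Va;
      double sum = std::accumulate(observed.begin(), observed.end(), 0.0);
      return (globalAverage * K + sum) / (K + observed.size());
  }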

  12. Moving on: Linear models are limiting. We've bastardized the whole matrix analogy so much that we aren't really restricted to linear models: we can add non-linear outputs such that instead of predicting with sum(userFeature[f][user] * movieFeature[f][movie]) for f from 1 to 40, we use sum G(userFeature[f][user] * movieFeature[f][movie]) for f from 1 to 40. Two choices for G proved useful. 1. Clip the prediction to 1-5 after each component is added. I.e., each feature is limited to only swaying the rating within the valid range, and any excess beyond that is lost rather than carried over. So, if the first feature suggests +10 on a scale of 1-5 and the second feature suggests -1, then instead of getting a 5 for the final clipped score, it gets a 4, because the score was clipped after each stage. The intuitive rationale here is that we tend to reserve the top of our scale for the perfect movie and the bottom for one with no redeeming qualities whatsoever, so there's a sort of measuring back from the edges that we do with each aspect independently. More pragmatically, since the target range has a known limit, clipping is guaranteed to improve our performance; and having trained a stage with clipping on, use it with clipping on. I did not really play with this extensively enough to determine there wasn't a better strategy. 2. The second choice for G is to introduce some functional non-linearity, such as a sigmoid, i.e., G(x) = sigmoid(x). Even if G is fixed, this requires modifying the learning rule slightly to include the slope of G, but that's straightforward. The next question is how to adapt G to the data. I tried a couple of options, including an adaptive sigmoid, but the most general, and the one that worked the best, was to simply fit a piecewise linear approximation to the true output/target-output curve. That is, if you plot the true output of a given stage vs the average target output, the linear model assumes this is a nice 45-degree line. But in truth, for the first feature for instance, you end up with a kink around the origin such that the impact of negative values is greater than the impact of positive ones. That is, for two groups of users with opposite preferences, each side tends to penalize more strongly than the other side rewards for the same quality. Or, put another way, below-average quality (subjective) hurts more than above-average quality helps. There is also a bit of a sigmoid to the natural data beyond just what is accounted for by the clipping. The linear model can't account for these, so it just finds a middle compromise; but even at this compromise, the inherent non-linearity shows through in an actual-output vs. average-target-output plot, and if G is then simply set to fit this, the model can further adapt with this new performance edge, which leads to potentially more beneficial non-linearity, and so on... This introduces new free parameters and encourages overfitting, especially for the later features, which tend to represent small groups. We found it beneficial to use this non-linearity only for the first twenty or so features and to disable it after that.
Moving on: Despite the regularization term in the final incremental law above, overfitting remains a problem. Plotting the progress over time, the probe rmse eventually turns upward and starts getting worse (even though the training error is still inching down). We found that simply choosing a fixed number of training epochs appropriate to the learning rate and regularization constant resulted in the best overall performance. I think for the numbers mentioned above it was about 120 epochs per feature, at which point the feature was considered done and we moved on to the next before it started overfitting. Note that now it does matter how you initialize the vectors: since we're stopping the path before it gets to the (common) end, where we started will affect where we are at that point. I wonder if a better regularization couldn't eliminate overfitting altogether, something like Dirichlet priors in an EM approach--but I tried that and a few others, and none worked as well as the above. [Plots: the probe and training rmse for the first few features with and without the regularization term "decay" enabled; the same, probe-set rmse only, further along, where you can see the regularized version pulling ahead; and probe rmse (vertical) against train rmse (horizontal), where the regularized version has better probe performance relative to its training performance.] Anyway, that's about it. I've tried a few other ideas over the last couple of weeks, including a couple of ways of using the date information, and while many of them have worked well up front, none held their advantage long enough to actually improve the final result. If you notice any obvious errors or have reasonably quick suggestions for better notation or whatnot to make this explanation more clear, let me know. And of course, I'd love to hear what y'all are doing and how well it's working, whether it's improvements to the above or something completely different. Whatever you're willing to share,
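A sketch of the first choice of G above: clip the running prediction to [1,5] after each feature's contribution, so excess sway is lost rather than carried over (+10 then -1 yields 4, not 5). The function name and signature are mine:

  #include <algorithm>

  // Per-stage clipped prediction: add each feature's contribution, then
  // clamp to the valid rating range before the next stage.
  double predictClipped(const double* uF, const double* mF, int nFeatures) {
      double sum = 0;
      for (int f = 0; f < nFeatures; ++f) {
          sum += uF[f] * mF[f];
          sum = std::min(5.0, std::max(1.0, sum));  // clip after each stage
      }
      return sum;
  }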

  13. [Diagram: the prediction pipeline. mpp-mpred.C passes (Mi, ProbeSup(Mi)={Ui1, ..., Uik}) to mpp-user.C, which passes (Mi, Sup(Mi), Uik, Sup(Uik)) to user-vote.C and movie-vote.C (with prune.C); their vote(Mi,Uik) and VOTE(Mi,Uik) are combined into Predict(Mi,Uik), written to predictions for each Uik in ProbeSup(Mi).]
Cinematch uses the Training Table Rents(MID, UID, Rating, Date) to classify new (MID, UID, Date) tuples (i.e., predict ratings). Nearest-neighbor user voting: uid votes on rating(MID,UID) if it is near enough to UID in its ratings of movies M={mid1, ..., midk} (near is based on a user-user correlation over M). Which user-user correlation (Pearson? Cosine?) and which set M={mid1, ..., midk}? Nearest-neighbor movie voting: mid votes on rating(MID,UID) if its ratings by U={uid1, ..., uidk} are near enough to those of MID (near is based on a movie-movie correlation over U). Which movie-movie correlation (Pearson or Cosine or ?) and which set U={uid1, ..., uidk}?
mpp-mpred.C reads PROBE and loops thru (Mi, ProbeSup(Mi)), passing each to mpp-user.C. mpp-mpred.C can call separate instances of mpp-user.C for many Us in parallel (governed by the number of slots). mpp-user.C loops thru ProbeSup(M), reads the config file, and prints prediction(M,U) to predictions. For user votes, mpp-user.C calls user-vote.C; for movie votes, movie-vote.C. user-vote.C prunes, loops thru the user voters V, calculating a V-vote, then combines the V-votes and returns the vote. movie-vote.C is similar. We must loop thru the V's (VPHD rather than HPVD) because the horizontal processing required by most correlation calculations is impossible using AND/OR/COMP alone. The data mining algorithms are in movie-vote.C (the first Nearest Neighbor Classification code); similar (dual) code either exists or will exist in user-vote.C. The file movie-vote-full.C contains the ARM attempts, boundary-based attempts and the Nearest Neighbor Classification attempts. The file movie-vote-justNN.C contains only the NN attempts (so we will start with that).
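A sketch of one candidate for the movie-movie correlation the slide asks about: plain Pearson correlation of two movies' ratings over the common supporting users U. The two vectors are assumed aligned on U; the slide leaves the actual choice (Pearson vs. cosine vs. adjusted cosine) open:

  #include <cmath>
  #include <vector>

  // Pearson correlation of two aligned rating vectors.
  double pearson(const std::vector<double>& x, const std::vector<double>& y) {
      const std::size_t n = x.size();
      double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
      for (std::size_t i = 0; i < n; ++i) {
          sx += x[i]; sy += y[i];
          sxx += x[i]*x[i]; syy += y[i]*y[i]; sxy += x[i]*y[i];
      }
      double num = n*sxy - sx*sy;
      double den = std::sqrt(n*sxx - sx*sx) * std::sqrt(n*syy - sy*sy);
      return den == 0 ? 0 : num / den;
  }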

  14. movie-vote.C code (2010_11_13 notes):

  extern double movie_vote(PredictionConfig *pcfg,
                           unsigned long int M, PTree & supportM,
                           unsigned long int U, PTree & supportU)
  {auto double MU=Users.get_rating(U,M)-2, VOTE=DEFAULT_VOTE, VOTE_sum=0, VOTE_cnt=0,
               Nb, Mb, dsSq, UCor=1,
               supportUsize=supportU.get_count(), supportMsize=supportM.get_count();
   struct pruning *internal_prune;
   struct external_prune *external_prune;
   auto PTree supM=supportM, supU=supportU;
   supM.clearbit(U); supU.clearbit(M);

   /* External pruning: Prune Users in supM */
   external_prune = pcfg->get_movie_Prune_Users_in_SupM();
   if (external_prune->enabled)
   { if (supM.get_count() > external_prune->params.Ct)
       do_pruning(external_prune, M, U, supM, supU);
     supM.clearbit(U); supU.clearbit(M);
     if ((supM.get_count()<1) || (supU.get_count()<1)) return VOTE; }

   /* External pruning: Prune Movies in supU (closing brace restored here) */
   external_prune = pcfg->get_movie_Prune_Movies_in_SupU();
   if (external_prune->enabled)
   { if (supU.get_count() > external_prune->params.Ct)
       do_pruning(external_prune, M, U, supM, supU);
     supM.clearbit(U); supU.clearbit(M);
     if ((supM.get_count()<1) || (supU.get_count()<1)) return VOTE; }

   auto PTreeSet & U_ptree_set = Users.get_ptreeset(),
                 & M_ptree_set = Movies.get_ptreeset();
   supU.clearbit(M); supM.clearbit(U);
   /* Per-rating masks: supU_r / supM_r select the members of supU / supM
      whose 3-bit rating code matches the value r. */
   auto PTree
     supU_1=supU&(~U_ptree_set[(U*3)+0])&( U_ptree_set[(U*3)+1])&( U_ptree_set[(U*3)+2]),
     supU_2=supU&( U_ptree_set[(U*3)+0])&(~U_ptree_set[(U*3)+1])&(~U_ptree_set[(U*3)+2]),
     supU_3=supU&( U_ptree_set[(U*3)+0])&(~U_ptree_set[(U*3)+1])&( U_ptree_set[(U*3)+2]),
     supU_4=supU&( U_ptree_set[(U*3)+0])&( U_ptree_set[(U*3)+1])&(~U_ptree_set[(U*3)+2]),
     supU_5=supU&( U_ptree_set[(U*3)+0])&( U_ptree_set[(U*3)+1])&( U_ptree_set[(U*3)+2]),
     supM_1=supM&(~M_ptree_set[(M*3)+0])&( M_ptree_set[(M*3)+1])&( M_ptree_set[(M*3)+2]),
     supM_2=supM&( M_ptree_set[(M*3)+0])&(~M_ptree_set[(M*3)+1])&(~M_ptree_set[(M*3)+2]),
     supM_3=supM&( M_ptree_set[(M*3)+0])&(~M_ptree_set[(M*3)+1])&( M_ptree_set[(M*3)+2]),
     supM_4=supM&( M_ptree_set[(M*3)+0])&( M_ptree_set[(M*3)+1])&(~M_ptree_set[(M*3)+2]),
     supM_5=supM&( M_ptree_set[(M*3)+0])&( M_ptree_set[(M*3)+1])&( M_ptree_set[(M*3)+2]),
     sou, souM, souU, som, somU, somM, spM, spU;
   auto double thr1, expnt1, thr2, expnt2, s, S, ss, sn, sM, sU, c, C, wt, XBalVT,
               wt_const=16;

  15. movie-vote.C NN code:

  /* Nearest Neighbor Code */
  supU.clearbit(M);
  auto unsigned long long int *supUlist = supU.get_indexes();
  for (unsigned long long int n = 0; n < supU.get_count(); ++n)  /* NLOOP (voters) */
  { auto unsigned long long int N = supUlist[n];
    if (N == M) continue;
    auto double NU=Users.get_rating(U,N)-2, MAX=0, smN=0, smM=0, MM=0, MN=0, NN=0,
                denom=0, dm;
    auto PTree supN = Movies.get_users(N), csMN = supM & supN;
    csMN.clearbit(U);
    dm = csMN.get_count();
    if (dm < 1) continue;

    /* External pruning: PRUNE USERS in CoSupMN */
    external_prune = pcfg->get_movie_Prune_Users_in_CoSupMN();
    if (external_prune->enabled)
    { if (csMN.get_count() > external_prune->params.Ct)
        do_pruning(external_prune, M, U, csMN, supU);
      csMN.clearbit(U); supU.clearbit(M);
      dm = csMN.get_count();
      if (dm < 1) continue; }

    /* Adjusted Cosine */
    auto double ACCor, Vbar, ACCnum=0, ACCden, ACCdenSum1=0, ACCdenSum2=0;
    auto unsigned long long int *csMNlist = csMN.get_indexes();
    for (unsigned long long int v = 0; v < csMN.get_count(); ++v)  /* VLOOP (dims) */
    { auto unsigned long long int V = csMNlist[v];
      auto double MV=Users.get_rating(V,M)-2, NV=Users.get_rating(V,N)-2;
      if (pow(MV-NV,2) > MAX) MAX = pow(MV-NV,2);
      smN += NV; smM += MV; MM += MV*MV; MN += NV*MV; NN += NV*NV; ++denom;
      /* Adjusted Cosine code */
      auto PTree supV = Users.get_movies(V);
      Vbar = Users.get_mean(V, supV);
      ACCnum     += (NV-Vbar)*(MV-Vbar);
      ACCdenSum1 += (NV-Vbar)*(NV-Vbar);
      ACCdenSum2 += (MV-Vbar)*(MV-Vbar);
    }  /* VLOOP ends */

    /* Adjusted Cosine code */
    ACCden = pow(ACCdenSum1,.5) * pow(ACCdenSum2,.5);
    ACCor = ACCnum / ACCden;  UCor = ACCor;  dm = csMN.get_count();
    if (denom < 1) continue;
    else { Nb = smN/dm; Mb = smM/dm; dsSq = NN - 2*MN + MM; VOTE = NU - Nb + Mb; }
    if (UCor > 0) { VOTE_sum += VOTE*UCor; VOTE_cnt += UCor; }
    else continue;
    if (pcfg->movie_vote_force_in_loop())
    { if ((VOTE<1) && (VOTE!=DEFAULT_VOTE)) VOTE=1;
      if ((VOTE>5) && (VOTE!=DEFAULT_VOTE)) VOTE=5; }
  }  /* end NLOOP */

  if (VOTE_cnt > 0) VOTE = VOTE_sum/VOTE_cnt; else VOTE = DEFAULT_VOTE;
  /* force_vote_after_Voter_Loop goes here. */
  if (pcfg->movie_vote_force_after_loop())
  { if ((VOTE < 1) && (VOTE != DEFAULT_VOTE)) VOTE=1;
    if ((VOTE > 5) && (VOTE != DEFAULT_VOTE)) VOTE=5; }
  return VOTE;
  }

16. The submit script, run in src/mpp-mpred-3.2.0, produces 25 subdirectories there, a$g$h for g,h in {.1, .2, .4, .7, .9} (a.1.1, a.1.2, a.1.4, ..., a.9.7, a.9.9). Each subdirectory, e.g., a.9.9, contains:
  a.9.9.config  hi-a.9.9.txt  hi-a.9.9.txt.answers  lo-a.9.9.txt  lo-a.9.9.txt.answers  p95test.txt.predictions  p95test.txt.rmse

The createconfigs script in src/mpp-mpred-3.2.0/p95/mu11 builds the 25 config files (a.1.1.config ... a.9.9.config) in p95/mu11/configs by substituting each (dMNsdsThr, dMNsdsExp) pair into a template, t.config:
  #!/bin/bash
  for g in .1 .2 .4 .7 .9
  do sed -i -e "s/dMNsdsThr=[^ ]*/dMNsdsThr=$g/" t.config
     for h in .1 .2 .4 .7 .9
     do sed -i -e "s/dMNsdsExp=[^ ]*/dMNsdsExp=$h/" t.config
        cp t.config configs/a$g$h.config
     done
  done

The submitin script in src/mpp-mpred-3.2.0 runs one job per config, producing a$g$h.out here:
  #!/bin/bash
  for g in .1 .2 .4 .7 .9
  do for h in .1 .2 .4 .7 .9
     do ./mpp-submit -S -i Data/p95test.txt -c p95/mu11/configs/a$g$h.config -t .05 -d ./p95/mu11 > a$g$h.out
     done
  done

I copy the 25 .out files (a.1.1.out ... a.9.9.out) to src/mpp-mpred-3.2.0/dotouts. In dotouts, the createtablejob script collects each run's input line into a table, job:
  #!/bin/bash
  for g in .1 .2 .4 .7 .9
  do for h in .1 .2 .4 .7 .9
     do grep "Input:   lo" a$g$h.out >> job
     done
  done
which yields: Input: lo-a.1.1.txt, Input: lo-a.1.2.txt, ..., Input: lo-a.9.9.txt.

A companion script, createtablermse, collects each run's final RMSE into a table, rmse:
  #!/bin/bash
  for g in .1 .2 .4 .7 .9
  do for h in .1 .2 .4 .7 .9
     do grep "RMSE: " a$g$h.out >> rmse
     done
  done

Sample of p95test.txt.predictions (movie ID: predicted ratings):
  12641: 1.22 3.65 2.55 4.04 1.85
  12502: 4.71 3.54 3.87 3.33 2.97
  ...
  10811: 4.05 3.49 3.94 3.39
  12069: 3.20 3.48

Sample of p95test.txt.rmse (per-movie squared errors, then running RMSE):
  Movie 12641: Ans 1 Pred 1.22 Err 0.04840; Ans 4 Pred 3.65 Err 0.12250; Ans 2 Pred 2.55 Err 0.30250; Ans 4 Pred 4.04 Err 0.00160; Ans 2 Pred 1.85 Err 0.02250. Sum 0.49750, Total 5, RMSE 0.315436. Running RMSE 0.315436 / 5 predictions.
  Movie 12502: Ans 4 Pred 4.71 Err 0.50410; Ans 5 Pred 3.54 Err 2.13160; Ans 5 Pred 3.87 Err 1.27690; Ans 3 Pred 3.33 Err 0.10890; Ans 2 Pred 2.97 Err 0.94090. Sum 4.96240, Total 5, RMSE 0.996233. Running RMSE 0.738911 / 10 predictions.
  ...
  Movie 10811: Ans 5 Pred 4.05 Err 0.90250; Ans 3 Pred 3.49 Err 0.24010; Ans 4 Pred 3.94 Err 0.00360; Ans 3 Pred 3.39 Err 0.15210. Sum 1.29830, Total 4, RMSE 0.569715. Running RMSE 0.964397 / 743 predictions.
  Movie 12069: Ans 4 Pred 3.20 Err 0.64000; Ans 3 Pred 3.48 Err 0.23040. Sum 0.87040, Total 2, RMSE 0.659697.
  Prediction summary: Sum 691.90610, Total 745, RMSE 0.963708.

The rmse table over the 25 (dMNsdsThr, dMNsdsExp) runs is nearly flat: 0.963708 (Sum 691.90610 over 745 predictions) for most runs, with a few variants — 0.964348, 0.963490, 0.963667, 0.962710, and 0.964664 — so Sum ranges only from 690.47330 to 693.27970.
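The Running RMSE bookkeeping in p95test.txt.rmse is just accumulated squared error over all predictions so far. A minimal sketch that reproduces the per-movie blocks (RmseTracker is my name; the format strings only approximate the file's layout):

  #include <cmath>
  #include <cstdio>

  // Accumulates squared errors per movie and overall, mirroring the
  // Sum / Total / RMSE / Running RMSE lines of p95test.txt.rmse.
  struct RmseTracker {
      double totalSum = 0;   // running sum of squared errors
      long   totalCnt = 0;   // running prediction count
      void movie(int id, const double* ans, const double* pred, int n) {
          std::printf("Movie: %d:\n", id);
          double sum = 0;
          for (int i = 0; i < n; ++i) {
              double err = (ans[i] - pred[i]) * (ans[i] - pred[i]);
              std::printf("%d: Ans: %g Pred: %.2f Error: %.5f\n", i, ans[i], pred[i], err);
              sum += err;
          }
          totalSum += sum; totalCnt += n;
          std::printf("Sum: %.5f Total: %d RMSE: %.6f Running RMSE: %.6f / %ld predictions\n",
                      sum, n, std::sqrt(sum / n), std::sqrt(totalSum / totalCnt), totalCnt);
      }
  };

  int main() {
      RmseTracker t;
      double ans[]  = {1, 4, 2, 4, 2};
      double pred[] = {1.22, 3.65, 2.55, 4.04, 1.85};
      t.movie(12641, ans, pred, 5);   // reproduces the Movie 12641 block above
  }

Run on the Movie 12641 data it prints Sum: 0.49750 Total: 5 RMSE: 0.315436, matching the file.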

17. [Figure: the Users (a–z) × Movies (1–9) TrainSet rating matrix, shown before and after an epoch, with the movie Feature-1 vector mF1 along the top and the user Feature-1 vector uF1 down the side; the two feature vectors are annotated in purple and green.] Back-propagate to update the purple and the green values (for Feature #1), using the slide-9 methods, to get epoch 2:
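Presumably the slide-9 rule is the standard single-feature gradient step: predict p = uF1[u]*mF1[m], form the error e = r - p, and move both factors down the gradient of e^2. A sketch of one epoch under that assumption (the array names are mine):

  // One SGD epoch over the (user, movie, rating) TrainSet cells, Feature #1 only.
  // L = LRATE; uF1, mF1 are the user and movie feature-1 vectors above.
  void epoch(double* uF1, double* mF1,
             const int* us, const int* ms, const double* rs, int nRatings, double L)
  {
      for (int i = 0; i < nRatings; ++i) {
          int u = us[i], m = ms[i];
          double e = rs[i] - uF1[u] * mF1[m];  // prediction error on this cell
          double uo = uF1[u];                  // save: both updates need the old value
          uF1[u] += L * e * mF1[m];            // step along -d(e^2)/d(uF1[u])
          mF1[m] += L * e * uo;                // step along -d(e^2)/d(mF1[m])
      }
  }

Running epoch once over the matrix above yields the epoch-2 values the slide asks for.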

18. SVD3. (The raw per-epoch traces are condensed here: each epoch the logs print the rounded user/movie feature vectors and the mse; asterisked entries in the original are over-printed digits.)

Run 1 with L=0.001, B=0, F=2 (teaching 1 F at a time): f=1 mse drops 11.53 → 1.804 and then stalls at 1.804; continuing with f=2 from where f=1 left off, mse stays at 1.804.

Run 1 with L=0.001, B=0, F=3 (teaching 1 F at a time with consistent data: ratings 3 3 3 3 3 3 3 2 2 2 2 2 2 2 4 4 4 4 4 4 5 5 5 1 1 1 1 3 3 3 3 2 2 2 2 2 3 3 3 3 3): f=1 mse drops 8.219 → 1.121 → 1.120; the f=2 pass sits at 1.119; the f=3 pass likewise stays at 1.119.

Run 1 with L=0.001, B=0, F=2 (teaching 1 F at a time with consistent data: 3 3 3 3 3 1 1 3 5 3 3 5 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 5 5 3 3 3 3 3 3 3 3 1 3 1): f=1 mse drops 8.902 → 0.780 → 0.779; the f=2 pass stalls at 0.779.

Run 1 with L=0.001, B=0, F=3 (same data), fixing f=2 and f=3 while calculating f=1: mse creeps down 1.121, 1.120, 1.118, 1.116, ..., 1.091, 1.089 — about 0.002 per epoch — while the rounded user feature vector stays frozen at 1 0 0 0 1 0 0 0 0 1 0 0 1 0 0 1 1 1 0 0 0 0 0 1 0.

Run 1 with L=0.1, B=0, F=3 (same data): f=1 mse drops 8.219 → 1.058 → 0.973 and stalls at 0.973 within two epochs.
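These runs teach one feature at a time: feature f trains while every other feature stays frozen, and the prediction is the sum over all F features. A sketch under those assumptions (the 26-user × 9-movie dimensions follow slide 17; all names are mine):

  // Train features one at a time, as in the runs above: feature f is updated
  // while the already-trained features (and those still at their init) are
  // held fixed. pred(u,m) = sum over f of uF[f][u] * mF[f][m].
  const int NUSERS = 26, NMOVIES = 9;

  double predict(double uF[][NUSERS], double mF[][NMOVIES], int F, int u, int m) {
      double p = 0;
      for (int f = 0; f < F; ++f) p += uF[f][u] * mF[f][m];
      return p;
  }

  // Returns the final-epoch mse, the number printed once per epoch in the logs.
  double train_one_feature_at_a_time(double uF[][NUSERS], double mF[][NMOVIES], int F,
                                     const int* us, const int* ms, const double* rs,
                                     int n, double L, int epochs)
  {
      double mse = 0;
      for (int f = 0; f < F; ++f)              // f=1, then f=2, ... as in the runs
          for (int ep = 0; ep < epochs; ++ep) {
              double sse = 0;
              for (int i = 0; i < n; ++i) {
                  double e = rs[i] - predict(uF, mF, F, us[i], ms[i]);
                  sse += e * e;
                  double uo = uF[f][us[i]];
                  uF[f][us[i]] += L * e * mF[f][ms[i]];  // only feature f moves
                  mF[f][ms[i]] += L * e * uo;
              }
              mse = sse / n;
          }
      return mse;
  }

One caution this sketch makes visible: if a later feature starts at all zeros, both gradient terms vanish and it can never move, which would explain why the f=2 and f=3 passes above sit at exactly the mse where f=1 left off.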

19. Run 1 with L=0.001, B=0, F=3 (teaching 1 F at a time with consistent data — the ratings repeated: 3 3 3 3 3 3 3 2 2 2 2 2 2 2 4 4 4 4 4 4 5 5 5 1 1 1 1 3 3 3 3 2 2 2 2 2 3 3 3 3 3). SVD4: the same run, fixing f=2 and f=3 while calculating f=1. The per-epoch trace — with the rounded user feature vector frozen at 1 0 0 0 1 0 0 0 0 1 0 0 1 0 0 1 1 1 0 0 0 0 0 1 0 and the movie row at 3 2 3 2 3 — reads mse 1.121, 1.120, 1.118, 1.116, ..., 1.091, 1.089, then (many epochs elided) 0.991, 0.989, 0.988, ..., 0.849, 0.848, 0.847: a steady drop of roughly 0.0015 per epoch with no sign of a stall.

20. Run 1 with L=0.001, B=0, F=3 (same data), fixing f=2 and f=3 while calculating f=1, continued. The trace goes on exactly as before — the rounded feature vector never changes while mse falls 0.845, 0.844, 0.843, ..., 0.726, 0.725 (still about 0.001 per epoch), then jumps back to 0.747 where the run was restarted (the "teaching 1 F at a time w consistent data" header and the SVD4 label repeat at that point), and resumes falling through 0.738, 0.736, ... down to 0.645, 0.644, 0.643. At L=0.001, convergence is painfully slow but monotone.
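At L=0.001 the mse drops only a thousandth or two per epoch for hundreds of epochs, so rather than eyeballing the trace, a simple stopping rule can flag a genuine stall. A sketch (the window and eps values are hypothetical choices):

  // True when mse improved by less than eps over the last `window` epochs.
  bool stalled(const double* mseHist, int nEpochs, int window, double eps) {
      if (nEpochs <= window) return false;
      return mseHist[nEpochs - 1 - window] - mseHist[nEpochs - 1] < eps;
  }

With window=20 and eps=1e-6 this would fire on the flat 0.106839 and 0.0802899382 runs two slides on, but not on the slow, steady descent here.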

21. [Raw mse traces, one value per epoch, continuing the same run at two print precisions. The coarse (three-decimal) trace falls smoothly from 0.642 through 0.371 to 0.287, roughly a thousandth per epoch. The fine (six-decimal) trace continues from 0.283404 down through 0.215023 and 0.167406 to 0.160118, the per-epoch decrease slowly shrinking but never stalling.]

22. LRATE (L) experiments, B=0.

With L=0.001 the mse creeps down by less than a ten-thousandth per epoch (0.160033, 0.159948, ..., 0.153708, 0.153635, ..., 0.149051). Increasing LRATE to 0.01 from 0.001: 0.147850, 0.146927, 0.146148, ..., 0.136485, then (leaving out the middle — basically I let it run until I run out of memory) it freezes at 0.106839. Next I increase LRATE to 0.1 from 0.01: 0.0898592086, 0.0846511859, ..., 0.0805614284, ..., stalling at 0.0802899382 for hundreds of identical epochs. Haven't reached the precision limit (the 123precis program), so further change isn't likely. I try increasing LRATE one last time, to .5 from .1 — it goes crazy, mse increases at a fast pace. Tried .2: it still increases. Tried .11: more progress, down through 0.0788242047 and 0.0783..., but it stalls out at 0.0782983319.

LRATE=.12: falls to 0.0764404922 and stalls out. LRATE=.13: falls to 0.0747040502, then starts back up (0.0747040577, 0.0747040737, ...) — is .0747040502 the minimum? LRATE=.135: bottoms at 0.0738769122, then back up. LRATE=.14: 0.0730758055, then up. L=.15: 0.0715435104. L=.16: 0.0701058986. L=.17: 0.0687460112. L=.18: 0.0674658631. L=.19: 0.0662619449. L=.2: 0.0651240092. L=.21: 0.0640527248. L=.22: 0.0630178215. L=.23: 0.0620424507. L=.24: 0.0611230862. L=.25: 0.060256004. L=.26: 0.0594580716. L=.27: 0.0586642026. L=.28: 0.0579296228. L=.29: 0.0571918706. L=.3: 0.0565356393. L=.31: 0.0552415815. L=.33: 0.054648410.

Continuing the sweep, converged mse vs L:
  0.0542006716 at L=.34   0.0536533858 at L=.35   0.0531393767 at L=.36
  0.0526568144 at L=.37   0.0522041129 at L=.38   0.0517799953 at L=.39
  0.0513835046 at L=.40   0.0510140156 at L=.41   0.0506712513 at L=.42
  0.0503553060 at L=.43   0.0500666802 at L=.44   0.0498063332 at L=.45
  0.0495757618 at L=.46   0.0493771166 at L=.47   0.0492133722 at L=.48
  0.0490885767 at L=.49   0.0490082227 at L=.50   0.0489798039 at L=.51
  0.0490136739 at L=.52
The minimum, mse = .0489798039, lands at LRATE = .51; at .52 the mse turns back up.

At L=.51 (mse = .0489798039, B=0), over the TrainSet (users a–z, movies 1–8, ratings 3 3 3 3 3 3 3 2 2 2 2 2 2 2 4 4 4 4 4 4 5 5 5 1 1 1 1 3 3 3 3 2 2 2 2 2 3 3 3 3 3) the learned Feature-1 vectors are:
  uF1 (a–z) = 1.2076503167 1 1 1 1 0.5483281315 1.1217606792 1 1 1 1 1 1 0.5483281315 1.1217606792 1 0.5483281315 1 0.5483281315 0 1 1 1 1 0.5483281315 1.1217606792
  mF1 (1–8) = 3 2 4.2174875551 5 0.6050789973 3 2 3
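The sweep above is done by hand, bumping LRATE and watching where the mse bottoms out (≈ .04898 at L ≈ .51). Automated, it's a one-loop grid search; train_to_stall below is a toy stand-in shaped like the observed bowl, not the real trainer:

  #include <cstdio>

  // Toy stand-in: a bowl with minimum .0489798039 at L=.51, mimicking the
  // observed mse-vs-LRATE curve. The real version would run SVD to stall.
  double train_to_stall(double L) {
      double d = L - 0.51;
      return 0.0489798039 + 0.5 * d * d;
  }

  int main() {
      double bestL = 0, bestMse = 1e9;
      for (double L = 0.34; L <= 0.525; L += 0.01) {
          double mse = train_to_stall(L);
          std::printf("%.10f mse %.2f L\n", mse, L);  // same format as the table
          if (mse < bestMse) { bestMse = mse; bestL = L; }
      }
      std::printf("best mse %.10f at L=%.2f\n", bestMse, bestL);
  }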

23. [Figure: three snapshots of the converged 3-feature model on the same TrainSet (users a–z, movies 1–8). Rows f1, f2, f3 give the per-user feature values — f1 near 1 everywhere, f2 and f3 at 0 — alongside the per-movie values and the back-propagated error terms, which shrink across the snapshots (e.g., -.22/-.1/-.08 and .39/.27/.25). The predictions reproduce the TrainSet's 3s and 2s almost exactly (2.99–3.00), while the 5 is predicted 4.21, then 4.09, then 4.08, and the 1 is predicted .61. Reference mse values: .0489798039 at LRATE=.51, B=0; .0674735658 at L=.18, B=0; .0715522634 at L=.15, B=0.]
