
Presentation Transcript


  1. Hi all. Sorry this is late. I'm giving an extra day, so the due date is now 6AM, January 3rd. Please don't email me after this! (Grades are due on January 4th.) Answer all three questions. 500-700 words per question. No more, please. Please use: 12-point font, double-spaced, (large) 2" margins. The layout will look awful, but it will be easy for me to scribble comments on! I'll be more attentive to email this weekend, and then again after Christmas Day, for questions. I'll be in my office on Dec. 26th in the morning and (very) early afternoon for appointments. Please email first so that there are no conflicts. Have a good holiday, everybody.

  2. Question 1
  • How could a PCFG learner be a model of human syntax learning?
  • Assume (not realistically):
    • Every human language can be described using a CFG.
    • There is no lexicon. The input to the learner is a string of word categories (as in Gibson and Wexler's TLA study, i.e. SVO, S-aux-V-O, etc.).
    • Before learning commences, the model has access to a stored collection of rules that is exactly the union of the rules that make up each and every human grammar. Call this GU. For example, if G1 = {rules that make up the grammar of English}, G2 = {rules that make up the grammar of Russian}, G3 = {rules that make up the grammar of Japanese}, . . . , Gn = {rules that make up the grammar of the last language in the list of possible human languages}, then GU = G1 U G2 U G3 U . . . U Gn.
  • How would the learning process proceed? What would be considered convergence? How could the model be tested? What is the relationship between parsing and learning (hint: compare with Fodor's STL model)? What is the relation between GU and UG? What problems might you foresee? Is this a cross-linguistic learning model (like the parameter-setting models) or a mono-lingual model (like Elman's SRN)?
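To make the GU setup concrete, here is a minimal Python sketch of one way it might be represented: grammars as sets of CFG rules, GU as their union, and a crude PCFG-style re-estimation step over rule counts. The rule fragments, counts, and the re-estimation scheme are invented for illustration only; they are assumptions, not part of the question or a real grammar of any language.

```python
from collections import Counter

# Toy "grammars" as sets of CFG rules written as (LHS, (RHS...)) tuples; the
# rules below are invented fragments, not real grammars of any language.
G1 = {("S", ("NP", "VP")), ("VP", ("V", "NP"))}   # an "SVO-like" fragment
G2 = {("S", ("NP", "VP")), ("VP", ("NP", "V"))}   # an "SOV-like" fragment
GU = G1 | G2                                      # GU = G1 U G2 U ... U Gn

# A PCFG learner over GU keeps a probability for every rule in GU and
# re-estimates those probabilities from how often each rule is used when
# parsing the input sample; one crude re-estimation step is shown below.
counts = Counter({rule: 1.0 for rule in GU})      # uniform pseudo-counts

def renormalize(counts):
    """Turn rule counts into a PCFG: probabilities sum to 1 per left-hand side."""
    totals = Counter()
    for (lhs, _rhs), c in counts.items():
        totals[lhs] += c
    return {rule: c / totals[rule[0]] for rule, c in counts.items()}

# Pretend a parser reported these rule uses for the current input strings
# (in the real setup they would come from parsing category strings like SVO).
observed_rule_uses = [("S", ("NP", "VP")), ("VP", ("V", "NP")),
                      ("S", ("NP", "VP")), ("VP", ("V", "NP"))]
counts.update(observed_rule_uses)

pcfg = renormalize(counts)
for rule, p in sorted(pcfg.items()):
    print(rule, round(p, 3))
# Rules never used by the input (here, VP -> NP V) lose probability mass,
# which is one possible picture of convergence toward the target grammar.
```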

  3. Question 2
  PCFGs, HMM taggers, and bag-of-words similarity-based retrieval/clustering techniques all incorporate different amounts of linguistic knowledge (LK, for our purposes here). Compare and contrast two of the three models. For each, be sure to discuss (a) to what extent the algorithm is stochastic and/or statistical, (b) the type and amount of LK utilized by the algorithm, (c) the structure of the input sample that the model was tested on, and (d) the effect of (a), (b) and (c) on the model's reported success(es). Please include a brief outline of each model's design, but focus on the design features that contribute to your answers to (a), (b), (c) and (d).

  Question 3
  Explain how you might apply an artificial neural network to do tagging. You don't have to give specific weights, but describe the topology you would use, what the input and output nodes would represent, what the training file and supervisor files would look like (you might give a small example of each), etc. Compare your approach to a simple HMM tagger. Make sure that you address why you might expect your model to do better or worse than the HMM tagger.
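For reference, the kind of simple HMM tagger mentioned in Questions 2 and 3 can be sketched as below: a bigram HMM with transition probabilities P(tag | previous tag), emission probabilities P(word | tag), and Viterbi decoding. The tagset, vocabulary, and hand-set probabilities are invented for illustration; a real tagger would estimate them from a tagged corpus rather than hard-coding them.

```python
import math

# Toy tagset, hand-set transition probabilities P(tag | previous tag) and
# emission probabilities P(word | tag). All numbers and words are invented.
tags = ["DET", "N", "V"]
trans = {("<s>", "DET"): 0.8, ("<s>", "N"): 0.2,
         ("DET", "N"): 1.0,
         ("N", "V"): 0.7, ("N", "N"): 0.3,
         ("V", "DET"): 0.6, ("V", "N"): 0.4}
emit = {("DET", "the"): 1.0,
        ("N", "dog"): 0.5, ("N", "man"): 0.4, ("N", "bites"): 0.1,
        ("V", "bites"): 0.9, ("V", "man"): 0.1}

def viterbi(words):
    """Most probable tag sequence for `words` under the toy HMM (Viterbi DP)."""
    # best[t] = (log probability of the best tag path ending in tag t, that path)
    best = {"<s>": (0.0, [])}
    for w in words:
        new = {}
        for t in tags:
            e = emit.get((t, w), 0.0)
            if e == 0.0:
                continue
            candidates = [(lp + math.log(trans[(prev, t)] * e), path + [t])
                          for prev, (lp, path) in best.items()
                          if trans.get((prev, t), 0.0) > 0.0]
            if candidates:
                new[t] = max(candidates)
        best = new
    return max(best.values())[1]

print(viterbi("the dog bites the man".split()))
# -> ['DET', 'N', 'V', 'DET', 'N']
```

A neural-network answer to Question 3 would replace the hand-set probability tables with learned weights, e.g. input nodes encoding the current word (and perhaps its neighbors) and output nodes encoding the candidate tags, which is one axis along which to compare the two approaches.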
