I'm thinking about doing some work in Natural Language Generation. What should I be reading? What are the big problems? What's the most interesting paper you've read in the area?
The go-to book on this is "Building Natural Language Generation Systems" (Reiter and Dale, 2000). The problem that's gotten the most interest in NLG is the generation of referring expressions; see "Computational Interpretations of the Gricean Maxims in the Generation of Referring Expressions" (Dale and Reiter, 1995) and "Graph-based generation of referring expressions" (Krahmer et al., 2003). Most recently, there's been a big push toward generating language that is multi-modal (e.g., incorporating gesture and gaze), as well as language that is psycholinguistically motivated.

Awesome! Thanks, Margaret.
(Jul 19 '10 at 12:47)
aria42
I thought the White et al. 2010 CL paper, "Generating tailored, comparative descriptions with contextually appropriate intonation", was really cool: http://aclweb.org/anthology-new/J/J10/J10-2001.pdf. The part that really grabbed me was listening to the stimuli from their human rating experiment: http://www.ling.ohio-state.edu/~mwhite/flights-stimuli/. The "gen" and "all" wavs are the baseline systems, and "apml" is the experimental system. The experimental system pushes information-structure information to the speech generator to produce prosodic "tunes" that (imho) sound natural. Prosody and information structure are some of the linguistic areas I find most difficult to get my head around, so I was impressed to see them work in a system.