Maximum Entropy models in their various forms have been successfully applied to a wide range of Natural Language Processing problems including tagging, parsing, and classification tasks.
In this talk, I will avoid describing the sophisticated theoretical foundations of Maximum Entropy modelling in favour of presenting a clear, intuitive understanding. Consider this a taster for my ALTA Summer School course, which will cover both the theory and the intuition in detail.
I will then describe a range of applications for which Stephen Clark and I have obtained state-of-the-art results using MaxEnt models, including:
More importantly, I will try to share some of the practical knowledge we have gained along the way, which will help you apply MaxEnt modelling to your own problems.
I will also talk about some ongoing experiments that my students and I have been conducting in Sydney using MaxEnt modelling, including:
Finally, I will talk about some directions and applications where I see MaxEnt modelling heading next.
James Curran is an ARC Postdoctoral Fellow in the Language Technology Research Group in the School of Information Technologies at the University of Sydney. He has just returned to Australia after completing his Ph.D. in computational lexical semantics at the University of Edinburgh.
His ARC-funded project, 'Ask the Net: Intelligent Natural Language Learning', involves automatically asking contributors simple questions via email, whose answers will be collected to create annotated data for standard NLP problems, e.g. Named Entity Recognition. An interesting challenge is finding ways of eliciting linguistic knowledge from those without linguistic training. His other research interests range from standard statistical NLP problems, such as tagging and parsing, through to system building, such as the development of question answering and biological text mining systems.