Undergrad elective; intro grad class. Offered every other Fall. A topics class on information retrieval that also covers the basics of natural language processing, indexing and data management, user modelling, machine learning, and quantitative evaluation of search engines. The class combines lectures and written homework with five programming assignments that lead up to a team-based project during the second half of the semester.
Graduate seminar; implementation-intensive; for students interested in research. Offered every Spring. The seminar builds on Information Retrieval but emphasizes research on automated knowledge bases, natural language processing, graph analysis, and information retrieval. In 2021 the curriculum was updated to respond to the major impact of neural networks on research in this field, and it was adapted to be appropriate for students with a background in either IR or ML.
Students produce reading notes on a weekly research paper, create a literature survey on a chosen topic, and develop a prototype for a shared task in teams (through multiple submissions). The course emphasizes reading and producing computer science research, with particular attention to rigorous quantitative evaluation of which method works best. It also teaches written and spoken research communication, covering both the existing literature and the students' own findings.
Required undergrad class; grad theory class. Offered every Spring. This formerly theory-intensive elective is now a mandatory class that offers insights into the theoretical underpinnings of computer science. The class connects algorithmic problems such as sorting, indexing, and graph traversal to theoretical principles via proofs, rigorous analysis of runtime and space complexity, and complexity classes such as NP-Complete and Undecidable. It teaches how to design algorithms with dynamic programming, greedy algorithms, and approximation algorithms, and how to deliver the necessary correctness proofs.
Undergrad elective; intro grad class. Offered every other Fall. A topics class on neural networks with applications to machine learning for sequential data domains. The class is self-contained and therefore covers the necessary gradient-based optimization methods and classification approaches (such as Logistic Regression and the Multi-Layer Perceptron) on which more complex sequence models are based. The class combines lectures, written homework, programming homework, and a class project (team or individual). The class is complementary to other Machine Learning and AI-related classes and focuses on non-interactive data analysis, such as classification, tagging, and generation.
A similar class was previously taught as CS 780/880 Machine Learning for Sequences (ML).
All courses are managed on mycourses.unh.edu.
I offer Senior/Honors/Masters theses, Masters projects, and independent studies to supplement the courses I teach. See procedures, expectations, and grading rationale.
I hold a weekly lab meeting with my Ph.D. students, other students in the program, and voluntary members (open to anyone). In these meetings we discuss recent research papers, students present their research results and give rehearsal talks, and we plan participation in international shared tasks. The lab meeting helps the participating Ph.D. students stay focused and motivated. Several students have formed research alliances, sharing data, code, and feedback without my intervention.
I have also given lectures on natural language processing and knowledge graphs for information retrieval at conference tutorials and summer schools:
Computational Social Science Summer School Lecture 2018 on “What is in my documents?”.
Conference tutorial on Utilizing Knowledge Graphs for Text-centric Retrieval at ICTIR 2016, WSDM 2017, SIGIR 2017. (Literature Overview).