Computational Semantics

Spring 2026
Umut Özge (✉️) , Anıl Öğdül (TA, ✉️)
⏰ W 2:40-5:30, II04
Check the current syllabus for course information and policies.
Some exercises in this course require helper code. See coursepy for instructions on getting and installing it.
Make sure you are enrolled on odtuclass to receive announcements and updates.
📖: reading | 📝: exercise | 🧩: optional/advanced material

Week Content
1 (18/2)

Course introduction; general discussion on study of language in cogsci.
Language primer;
Schubert (2020);
Chomsky (1957).

2 (25/2)

Goals of linguistic theory; more on the concept of grammar and how it relates to meaning.


Chomsky (1957);

Eisenstein (2019); the rest of the chapter is 🧩.


Be prepared to get from a phrase structure grammar to the derivation, and vice versa.
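To practice going from a phrase structure grammar to a derivation, it can help to see the rewriting process made explicit. Below is a minimal sketch in plain Python (a toy grammar of my own, not the coursepy helper code): the grammar is a dict from nonterminals to right-hand sides, and a small search finds a leftmost derivation of a target sentence.

```python
# Toy phrase structure grammar: nonterminal -> list of right-hand sides.
# (Illustrative example only; not the course's grammar or helper code.)
GRAMMAR = {
    "S":   [("NP", "VP")],
    "NP":  [("Det", "N")],
    "VP":  [("V",), ("V", "NP")],
    "Det": [("the",)],
    "N":   [("dog",), ("cat",)],
    "V":   [("barks",), ("sees",)],
}

def derive(form, target, steps, limit=10):
    """Leftmost derivation search: repeatedly rewrite the leftmost
    nonterminal in `form` until the target terminal string is reached.
    Returns the list of sentential forms, or None if no derivation exists
    within `limit` steps."""
    if len(steps) > limit:
        return None
    if all(sym not in GRAMMAR for sym in form):   # all symbols are terminals
        return steps if list(form) == target else None
    i = next(j for j, sym in enumerate(form) if sym in GRAMMAR)
    for rhs in GRAMMAR[form[i]]:
        result = derive(form[:i] + rhs + form[i + 1:], target,
                        steps + [form[:i] + rhs + form[i + 1:]], limit)
        if result is not None:
            return result
    return None

for step in derive(("S",), ["the", "dog", "barks"], [("S",)]):
    print(" ".join(step))
```

Running this prints each sentential form of the derivation, from `S` down to `the dog barks`; reading the steps bottom-up recovers the grammar rules used, which is the "vice versa" direction of the exercise.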

3 (4/3)

We continued with grammar, phrase structure, and various forms of adequacy. We couldn't discuss Eisenstein; we will next week.
Ex. 1 from Phrase structure grammar.

4 (11/3)

We continued with grammatical analyses and started to look at how questions are generally approached in machine learning and computational linguistics.
Next week we will go on by having a more detailed look at the learning perspective with a simple example of text classification.
The previous week’s exercise on phrase structure grammars should be solved before next week’s session. There will be a quiz about the topic of the exercise.
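As a preview of the text classification example mentioned above, here is a minimal sketch of a unigram Naive Bayes classifier with add-one smoothing. The training data and labels are invented toy examples, not course material.

```python
# Toy Naive Bayes text classifier (invented data, for illustration only).
import math
from collections import Counter

train = [
    ("good great fun", "pos"),
    ("great good enjoyable", "pos"),
    ("bad boring awful", "neg"),
    ("awful bad dull", "neg"),
]

labels = {y for _, y in train}
priors = Counter(y for _, y in train)            # document counts per class
counts = {y: Counter() for y in labels}          # word counts per class
for text, y in train:
    counts[y].update(text.split())
vocab = {w for c in counts.values() for w in c}

def classify(text):
    """Return the class maximizing log P(y) + sum_w log P(w|y),
    with add-one (Laplace) smoothing of the word probabilities."""
    best, best_score = None, -math.inf
    for y in labels:
        total = sum(counts[y].values())
        score = math.log(priors[y] / len(train))
        for w in text.split():
            score += math.log((counts[y][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = y, score
    return best

print(classify("good fun"))   # → pos
```

The point of the sketch is the contrast with grammar-based modeling: the classifier predicts labels from word counts alone, with no structural analysis of the input.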

5 (18/3)

We discussed the significance of explicit grammar modeling. The take-home message was that some regularities in natural language can be discovered only by detailed modeling of the structure of expressions.

This issue was closely related to the trade-off between predictive power and explanatory power. The more you can predict, the less you can explain, and vice versa.

Solutions to the quiz questions can be found here.

6 (25/3)

We started model-theoretic interpretation.

Model-theoretic interpretation
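As a first illustration of model-theoretic interpretation, the sketch below builds a toy model (the domain, predicate extensions, and formula encoding are my own invented examples, not course code): a domain of individuals, an interpretation function mapping predicates to sets, and an evaluator that computes truth values against the model.

```python
# Toy model-theoretic interpretation (invented model, for illustration only).
DOMAIN = {"a", "b", "c"}
INTERP = {
    "dog":    {"a", "b"},                 # unary predicates as sets
    "cat":    {"c"},
    "chases": {("a", "c"), ("b", "c")},   # binary relations as sets of pairs
}

def evaluate(formula):
    """Evaluate a formula, given as a nested tuple, against the model."""
    op = formula[0]
    if op == "pred":                      # ("pred", name, arg1[, arg2])
        _, name, *args = formula
        ext = INTERP[name]
        return args[0] in ext if len(args) == 1 else tuple(args) in ext
    if op == "not":
        return not evaluate(formula[1])
    if op == "and":
        return evaluate(formula[1]) and evaluate(formula[2])
    if op == "exists":                    # ("exists", fn from individual to body)
        _, body = formula
        return any(evaluate(body(x)) for x in DOMAIN)
    raise ValueError(f"unknown operator: {op}")

# "Some dog chases c" — true in this model, since a chases c.
f = ("exists",
     lambda x: ("and", ("pred", "dog", x), ("pred", "chases", x, "c")))
print(evaluate(f))   # → True
```

The key idea the sketch is meant to convey: truth is always relative to a model, so changing `DOMAIN` or `INTERP` can change the truth value of the same formula.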