FoMo lecture by Selene Báez Santamaría

Selene Báez Santamaría gave a lecture in the FoMo series, a seminar series highlighting the work being done in Amsterdam around foundation models: large, highly reusable machine learning models trained on vast amounts of data.

Selene's FoMo lecture, entitled ‘A Practical Approach to GPTx Models: Evaluating Task-Specific Performance and Insights’, concerned the task-specific performance of large language models. In research, the API accessibility of such models has made it possible to effortlessly explore these models' performance on a wide range of scientific tasks. In this talk, Selene presented her findings from three specific tasks: knowledge base completion, argument mining, and task-oriented dialogue incorporating subjective knowledge.