Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.
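The XML version mentioned above follows the standard sitemap protocol, with one entry per post or page. A minimal sketch of what such an entry looks like (the URL and date here are placeholders, not actual values from this site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per post or page on the site -->
    <loc>https://example.com/posts/2021/sample-post/</loc>
    <lastmod>2021-01-01</lastmod>
  </url>
</urlset>
```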

Pages

Posts

Future Blog Post

This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.
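For reference, this setting lives in Jekyll's _config.yml at the site root; a minimal sketch:

```yaml
# _config.yml
# When false, Jekyll skips posts whose date is in the future
# rather than publishing them.
future: false
```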

Blog Post number 4

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 3

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 2

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 1

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

portfolio

publications

Formal Pragmatics of 'Even-If'

Rik Koncel-Kedziorski

Published in University of Washington Working Papers in Linguistics, Vol. 31, 2013

“[T]he purpose of an EVEN-IF is to assert the absence of a non-accidental connection between the antecedent or its negation and the consequent”

PDF

Multi-resolution Language Grounding with Weak Supervision

Rik Koncel-Kedziorski, Hannaneh Hajishirzi, Ali Farhadi

Published in EMNLP 2014, 2014

“Our method makes use of a factorized objective function which allows us to model the complex interplay of resolutions. Our language model takes advantage of the discourse structure of the commentaries, making it robust enough to handle the unique language of the soccer domain.”

PDF

Parsing Algebraic Word Problems into Equations

Rik Koncel-Kedziorski, Hannaneh Hajishirzi, Ashish Sabharwal, Oren Etzioni, Sienna Dumas Ang

Published in TACL 2015, Volume 3, 2015

“ALGES learns to map spans of text to arithmetic operators, to combine them given the global context of the problem, and to choose the “best” tree corresponding to the problem.”

PDF

MAWPS: A Math Word Problem Repository

Rik Koncel-Kedziorski, Subhro Roy, Aida Amini, Nate Kushman, Hannaneh Hajishirzi

Published in NAACL 2016, 2016

“[O]ur framework provides the possibility for customizing a dataset with regard to considerations such as lexical and template overlap or grammaticality, allowing researchers to choose how many of the difficulties of open domain web-sourced word problem texts they want to tackle”

PDF

Phonological Pun-derstanding

Aaron Jaech, Rik Koncel-Kedziorski, Mari Ostendorf

Published in NAACL 2016, 2016

“From the high culture of Shakespeare’s plays, to the depths of the YouTube comments section, from advertising slogans to conversations with nerdy parents, puns are a versatile rhetorical device and their understanding is essential to any comprehensive approach to computational humor.”

PDF

A Theme-Rewriting Approach for Generating Algebra Word Problems

Rik Koncel-Kedziorski, Ioannis Konstas, Luke Zettlemoyer, Hannaneh Hajishirzi

Published in EMNLP 2016, 2016

“Given a theme, the rewrite algorithm constructs new texts by substituting thematically appropriate words and phrases, as measured with automatic metrics over the theme text collection, for parts of the original texts. This process optimizes for a number of metrics of overall text quality, including syntactic, semantics, and discourse scores. It uses no hand crafted templates and requires no theme-specific tuning data, making it easy to apply for new themes in practice”

PDF

Data-Driven Methods for Solving Algebra Word Problems

Benjamin Robaidek, Rik Koncel-Kedziorski, Hannaneh Hajishirzi

Published in arxiv, 2018

“[H]ow do data-driven approaches to math word problem solving compare to each other? How can data-driven approaches benefit from recent advances in neural representation learning? What are the limits of data-driven solvers?”

PDF

Pyramidal Recurrent Unit for Language Modeling

Sachin Mehta, Rik Koncel-Kedziorski, Mohammad Rastegari, Hannaneh Hajishirzi

Published in EMNLP 2018, 2018

“At the heart of the PRU is the pyramidal transformation (PT), which uses subsampling to effect multiple views of the input vector. The subsampled representations are combined in a pyramidal fusion structure, resulting in richer interactions between the individual dimensions of the input vector than is possible with a linear transformation.”

PDF

Text Generation from Knowledge Graphs with Graph Transformers

Rik Koncel-Kedziorski, Dhanush Bekal, Yi Luan, Mirella Lapata, Hannaneh Hajishirzi

Published in NAACL 2019, 2019

“The main contributions of this work include: 1) We propose a new graph transformer encoder that applies the successful sequence transformer to graph structured inputs. 2) We show how IE output can be formed as a connected unlabeled graph for use in attention-based encoders. 3) We provide a large dataset of knowledge graphs paired with scientific texts for further study.”

PDF

MathQA: Towards Interpretable Math Word Problem Solving with Operation-Based Formalisms

Aida Amini, Saadia Gabriel, Peter Lin, Rik Koncel-Kedziorski, Yejin Choi, Hannaneh Hajishirzi

Published in NAACL 2019, 2019

“In this paper, we introduce a new operation-based representation language for solving math word problems. We use this representation language to construct MathQA, a new large-scale, diverse dataset of 37k English multiple-choice math word problems covering multiple math domain categories”

PDF

DeFINE: Deep Factorized Input Token Embeddings for Neural Sequence Modeling

Sachin Mehta, Rik Koncel-Kedziorski, Mohammad Rastegari, Hannaneh Hajishirzi

Published in ICLR 2020, 2020

“The embedding layer can thus be thought of as a wide, shallow network consisting of a single linear transformation [that] takes a token from its orthographic form to a representation of those of its morphosyntactic and semantic properties which are relevant for modeling an arbitrary number of contexts in which the token can occur. We hypothesize that … a shallow network would require exceptional capacity to learn a good approximation [but] a deeper network can … with significantly fewer parameters …”

PDF

Citation Text Generation

Kelvin Luu, Rik Koncel-Kedziorski, Kyle Lo, Isabel Cachola, Noah A Smith

Published in Pre-print, 2020

“Automatically describing inter-document relationships could dramatically decrease the time researchers devote to literature review. For instance, a new paper could be explained in terms of its relationships to relevant works that a particular reader is most familiar with, rather than just those which the authors elected to cite”

PDF

A Controllable Model of Grounded Response Generation

Zeqiu Wu, Michel Galley, Chris Brockett, Yizhe Zhang, Xiang Gao, Chris Quirk, Rik Koncel-Kedziorski, Jianfeng Gao, Hannaneh Hajishirzi, Mari Ostendorf, Bill Dolan

Published in Pre-print, 2020

“We posit that both grounding knowledge and lexical control are essential to generating reliable information. We therefore introduce a generation framework called controllable grounded response generation that incorporates both components.”

PDF

Extracting Summary Knowledge Graphs from Long Documents

Zeqiu Wu, Rik Koncel-Kedziorski, Mari Ostendorf, Hannaneh Hajishirzi

Published in Pre-print, 2020

“[Prior] techniques focus on extracting all entities and relations from a document, which for long and dense documents such as scientific papers may be hundreds or thousands. This poses a new challenge: how do we determine the most important entities in a paper and the key relationships between them?”

PDF

Go Forth and Prosper: Language Modeling with Ancient Textual History

Rik Koncel-Kedziorski and Noah A. Smith

Published in Pre-print, 2021

“We introduce a simple technique for improving language modeling of long documents by effectively extending the LM’s accessible history beyond the architecture-specified context window and into the “ancient history”—text which comes before the beginning of the context window. We train an auxiliary function to select the parts of the ancient history that are most predictive of the future text…”

PDF

talks

teaching

Teaching experience 1

Undergraduate course, University 1, Department, 2014

This is a description of a teaching experience. You can use markdown like any other post.

Teaching experience 2

Workshop, University 1, Department, 2015

This is a description of a teaching experience. You can use markdown like any other post.