
Other Papers

ACT-R Papers

Explorations in ACT-R Based Language Analysis – Memory Chunk Activation, Retrieval and Verification without Inhibition (2012, 6 pages, ICCM conference): This paper explores the benefits and challenges of using the ACT-R cognitive architecture in the development of a large-scale, functional, cognitively motivated language analysis model. The paper focuses on ACT-R’s declarative memory retrieval mechanism, proposing extensions to support verification of retrieved chunks, multi-level activation spread and carry-over activation. The paper argues against the need for inhibition between competing chunks, on the grounds that such inhibition is necessarily task specific.

Explorations in ACT-R Based Cognitive Modeling – Chunks, Inheritance, Production Matching and Memory in Language Analysis (2011, 6 pages, Advances in Cognitive Systems): This paper explores the benefits and challenges of using the ACT-R cognitive architecture in the development of a large-scale, functional, cognitively motivated language analysis model. The paper focuses on chunks, inheritance, production matching and memory, proposing extensions to ACT-R to support multiple inheritance and suggesting a mapping from the focus of attention, working memory and long-term memory to ACT-R buffers and declarative memory (DM).

Advantages of ACT-R over Prolog for Natural Language Analysis (unpublished, 7 pages): This paper discusses the advantages of using the ACT-R cognitive architecture over the Prolog programming language for the research and development of a large-scale, functional, cognitively motivated model of language analysis. Although Prolog was developed for Natural Language Processing (NLP), it lacks any probabilistic mechanisms for dealing with ambiguity and relies on failure detection and algorithmic backtracking to explore alternative analyses. These mechanisms are problematic for handling ill-formed or unexpected inputs, often resulting in an exploration of the entire search space, which becomes intractable as the complexity and variability of the allowed inputs and corresponding grammar grow. By comparison, ACT-R provides context-dependent and probabilistic mechanisms which allow the model to incrementally pursue the best analysis. When combined with a non-monotonic context accommodation mechanism that supports modest adjustment of the evolving analysis to handle cases where the locally best analysis is not globally preferred, the result is an efficient pseudo-deterministic mechanism that obviates the need for failure detection and backtracking, aligns with our basic understanding of Human Language Processing (HLP), and is scalable to broad coverage. The transition of the language analysis model from Prolog to ACT-R has supported expansion of the model’s capabilities well beyond its Prolog predecessor, suggesting that a cognitively motivated approach to language analysis may also be suitable for achieving a functional capability.

AI Papers

Simplifying the Mapping from Referring Expression to Referent in a Conceptual Semantics of Reference (2010, 6 pages, Cog Sci conference): In Jackendoff’s Conceptual Semantics, reference to objects, situations, places, directions, times, manners, and measures is supported, but reference is limited to instances of these conceptual categories. This paper proposes an extension of Jackendoff’s referential types along an orthogonal dimension of reference which is cognitively motivated in suggesting the possibility of referring to types, prototypes and exemplars in addition to instances, as well as classes and collections of all referential types, and vacuous instances and collections. The paper also introduces a bipartite distinction between a situation model and the mental universe which helps to explain apparent non-referential uses of referring expressions. The primary motivation for expanding the ontology of referential types and distinguishing the situation model from the mental universe is to simplify the mapping from linguistic expressions to corresponding representations of referential meaning. The viability of this approach hinges on adoption of Jackendoff’s mentalist semantics, in which there is no direct reference to actual objects in the external world.
A Naturalistic, Functional Approach to Modeling Language Comprehension (2008, 8 pages, Naturalistic Approaches to AI): A naturalistic approach to the development of a large-scale, functional, cognitively plausible model of language comprehension that contrasts with mainstream cognitive modeling and computational linguistics research
Can NLP Systems be a Cognitive Black Box? (2006, 6 pages): An examination of whether NLP systems can be developed without considering how humans process language
Symposium Topic: Software Agents with Natural Language Capabilities -- Where are we? (2004, 4 pages, BRIMS conference): A symposium topic presented at the Behavior Representation in Modeling and Simulation (BRIMS) conference
The Future of Expert System Development Tools (1993, 10 pages): A paper on the evaluation and selection of an expert system development tool for use in two AI projects

A Consideration of Prolog (1989, 17 pages): A largely empirical consideration of the utility of Prolog for the development of AI systems

Linguistic Papers

Towards a Semantics of X-Bar Theory (2003, 6 pages, unpublished): A proposed semantics for X-Bar Theory based on Double R Grammar

Specifiers, Not Heads, Determine Phrase and Clause Type (2003, 9 pages, unpublished): A consideration of the referential function of specifiers with respect to the determination of phrase and clause type

Cognitive Modeling Papers

Comparing Three Variants of a Computational Process Model of Basic Aircraft Maneuvering (2003, 12 pages): Paper presented at the Behavior Representation in Modeling and Simulation (BRIMS) conference

A Computational Process Model of Basic Aircraft Maneuvering (2003, 6 pages): Paper presented at the Fifth International Conference on Cognitive Modeling
DISTRIBUTION A: Approved for public release; distribution unlimited.