Abstract: In the decade since 2010, successes in artificial intelligence have been at the forefront of computer science and technology, and vector space models have solidified a position at the forefront of artificial intelligence. At the same time, quantum computers have become much more powerful, and announcements of major advances are frequently in the news. The mathematical techniques underlying both of these areas have more in common than is sometimes realized. Vector spaces took a position at the axiomatic heart of quantum mechanics in the 1930s, and this adoption was a key motivation for the derivation of logic and probability from the linear geometry of vector spaces. Quantum interactions between particles are modelled using the tensor product, which is also used to express objects and operations in artificial neural networks. This paper describes some of these common mathematical areas, including examples of how they are used in artificial intelligence (AI), particularly in automated reasoning and natural language processing (NLP). Techniques discussed include vector spaces, scalar products, subspaces and implication, orthogonal projection and negation, dual vectors, density matrices, positive operators, and tensor products. Application areas include information retrieval, categorization and implication, modelling word-senses and disambiguation, inference in knowledge bases, and semantic composition. Some of these approaches can potentially be implemented on quantum hardware. Many of the practical steps in this implementation are in early stages, and some are already realized. Explaining some of the common mathematical tools can help researchers in both AI and quantum computing further exploit these overlaps, recognizing and exploring new directions along the way.
Keywords: mathematical foundations; information retrieval; natural language