I'm trying to find information about the use of tensors in natural language processing. Is anyone aware of any work on tensors and NLP (publications, projects, etc.)? All input is highly appreciated!

thanks,

f

asked Jan 04 '11 at 03:51

Fredrik Olsson

Searching "tensor" in the ACL Anthology is a start: http://www.google.com/cse?cx=011664571474657673452%3A4w9swzkcxiy&cof=FORID%3A0&q=tensor&sa=Search+the+Anthology

(Jan 04 '11 at 05:09) Alexandre Passos ♦

By tensors, do you mean multi-dimensional arrays?

(Jan 12 '11 at 02:01) Yaroslav Bulatov

5 Answers:

I think you should have a look at Peter Turney's work, in particular his technical paper on tensor decomposition and the accompanying source code.

answered Jan 05 '11 at 02:01

Guillaume Pitel

I seem to run across two kinds of tensors in the literature. The first is essentially a multi-dimensional array. It can be compactly represented; for instance, you can view a discrete probability distribution over k variables as a tensor of rank k, and a graphical model as a compact factorization of this tensor.
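As a concrete illustration of that first kind (a toy sketch of my own, not taken from any of the work cited here): a chain-structured graphical model stores the joint over three binary variables as three small factors rather than the full 2x2x2 array.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy chain model P(a, b, c) = P(a) * P(b|a) * P(c|b)
# over three binary variables (all numbers are made up).
p_a = np.array([0.6, 0.4])                       # P(a)
p_b_given_a = rng.dirichlet([1.0, 1.0], size=2)  # row i is P(b | a=i)
p_c_given_b = rng.dirichlet([1.0, 1.0], size=2)  # row j is P(c | b=j)

# The full joint is a rank-3 tensor (a 2x2x2 array) assembled from
# the compact factors; the factorization IS the graphical model.
joint = np.einsum('a,ab,bc->abc', p_a, p_b_given_a, p_c_given_b)

print(joint.shape)             # (2, 2, 2)
print(round(joint.sum(), 6))   # 1.0 -- a valid joint distribution
```

With k variables of d states each, the full tensor has d^k entries while the chain factorization needs only O(k * d^2).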

A more traditional kind of tensor comes up in differential geometry. This kind of tensor is an object in a particular space, and it can be represented as a "multi-dimensional array" in a particular coordinate system. Different coordinate systems give different arrays for the same tensor. This kind of tensor space comes equipped with extra structure, like derivatives.
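A minimal sketch of that coordinate-dependence point (my own example): the array representing a bilinear form changes under a change of basis, but the scalar the tensor computes does not.

```python
import numpy as np

rng = np.random.default_rng(1)

# A bilinear form T(u, v) = u^T A v, represented by the array A
# in the standard basis (illustrative numbers only).
A = rng.standard_normal((3, 3))
u, v = rng.standard_normal(3), rng.standard_normal(3)

# Change of basis: the columns of P are the new basis vectors.
P = rng.standard_normal((3, 3))
A_new = P.T @ A @ P             # the array in the new coordinates
u_new = np.linalg.solve(P, u)   # coordinates of u in the new basis
v_new = np.linalg.solve(P, v)

# Different arrays, same tensor: the scalar T(u, v) is unchanged.
print(np.allclose(u @ A @ v, u_new @ A_new @ v_new))   # True
```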

The closest I've seen to the "differential geometry" kind of tensor used in machine learning is Peter McCullagh's "Tensor Methods in Statistics" book. It is freely available.

answered Jan 12 '11 at 17:58

Yaroslav Bulatov

edited Jan 12 '11 at 18:01

One recent paper that comes to mind is Baroni and Lenci (Computational Linguistics, 2010). From the abstract: "...the Distributional Memory framework extracts distributional information once and for all from the corpus, in the form of a set of weighted word-link-word tuples arranged into a third-order tensor. Different matrices are then generated from the tensor, and their rows and columns constitute natural spaces to deal with different semantic problems."
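A toy version of that setup (invented counts, just to show the shape of the idea): a third-order word-link-word count tensor, matricized into a word-by-(link, word) matrix whose rows can then serve as a word-similarity space.

```python
import numpy as np

# Hypothetical word-link-word tensor in the spirit of Distributional
# Memory: axis 0 = words, axis 1 = links, axis 2 = words.
words = ['dog', 'bone', 'cat']
links = ['subj_of', 'obj_of']
T = np.zeros((3, 2, 3))
T[0, 0, 1] = 5.0   # (dog, subj_of, bone) -- counts are made up
T[2, 0, 1] = 2.0   # (cat, subj_of, bone)
T[1, 1, 0] = 4.0   # (bone, obj_of, dog)

# One matricization of the tensor: each word becomes a row vector
# over all (link, word) contexts.
word_by_linkword = T.reshape(len(words), -1)
print(word_by_linkword.shape)   # (3, 6)
```

Other unfoldings of the same tensor (e.g. word-pair by link) give different matrices, hence different semantic spaces, without re-extracting anything from the corpus.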

answered Jan 18 '11 at 17:31

Yariv Maron

I can't comment on NLP, but I can give some decent resources for learning about tensors in general: Introduction to Vectors and Tensors, Vol. 1 and Vol. 2.

The first book spends some time introducing the abstract-algebra concepts needed to define tensors; the second book focuses more on the use of tensors and the like than on the underpinnings.

The only ML algorithm I've encountered so far that uses tensors is Graham Taylor's factored conditional restricted Boltzmann machines, but they factor the 3-way tensor into a series of matrix multiplications to avoid the O(n^3) (or would it be O(n^4)?) complexity that would have come from using the tensor they initially describe.
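The factoring trick can be sketched as follows (a generic rank-f factorization of a 3-way interaction, not Taylor et al.'s exact model): when the tensor is a sum of f rank-one components, the O(n^3) contraction collapses to three matrix-vector products.

```python
import numpy as np

rng = np.random.default_rng(2)
n, f = 30, 5   # n units per group, f factors (f << n)

# Hypothetical factored 3-way tensor: three n-by-f matrices stand
# in for the full n x n x n array of interaction weights.
Wx, Wy, Wz = (rng.standard_normal((n, f)) for _ in range(3))
x, y = rng.standard_normal(n), rng.standard_normal(n)

# Full tensor reconstructed from the factors, contracted at O(n^3) cost.
W = np.einsum('if,jf,kf->ijk', Wx, Wy, Wz)
full = np.einsum('i,j,ijk->k', x, y, W)

# Same result via the factored route: three O(n*f) mat-vec products
# and an elementwise product, never forming the n^3 tensor.
factored = Wz @ ((Wx.T @ x) * (Wy.T @ y))

print(np.allclose(full, factored))   # True
```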

This answer is marked "community wiki".

answered Jan 05 '11 at 04:00

Brian Vandenberg

Tim Van de Cruys worked with tensors for NLP (lexical semantics). His PhD thesis can be found here: http://dissertations.ub.rug.nl/FILES/faculties/arts/2010/t.van.de.cruys/14complete.pdf

answered Jan 08 '11 at 08:24

b_a


User submitted content is under Creative Commons: Attribution - Share Alike; Other things copyright (C) 2010, MetaOptimize LLC.