
This page concerns logic atoms r(t1, ..., tn), where r is a relation and the ti are constants. For simplicity, we consider only binary relations r(x, y).

There is more than one way to formulate the problem. RESCAL (Nickel et al. 2011[1], Nickel et al. 2012[2]) represents all relations (including non-existing ones, which are assumed to be false) by a binary three-way tensor X of size n×n×m. Each frontal slice Xk (one per relation type) is then factorized as A Rk AT, where A (n×r) represents constants and Rk (r×r) represents the k-th relation type. Parameters are estimated under a least-squares loss.
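The factorization above can be sketched in a few lines of numpy (an illustrative toy, not the authors' code; dimensions and random initialization are made up). An atom rk(ei, ej) is scored as the bilinear form aiT Rk aj, so each slice Xk is approximated by A Rk AT:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 5, 3, 2          # entities, latent dimension, relation types (arbitrary)
A = rng.normal(size=(n, r))       # one row per constant
R = rng.normal(size=(m, r, r))    # one r x r matrix per relation type

def score(i, k, j):
    """Predicted truth score of the atom r_k(e_i, e_j)."""
    return A[i] @ R[k] @ A[j]

# Reconstructing a whole slice: X_k is approximated by A R_k A^T,
# and entry (i, j) of that product equals score(i, k, j).
X_hat_0 = A @ R[0] @ A.T
assert np.isclose(X_hat_0[1, 2], score(1, 0, 2))
```

In the actual model, A and the Rk are fitted (e.g. by alternating least squares) so that X_hat approximates the observed binary tensor.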

This model seems equivalent to the bilinear models of Sutskever et al. (2009)[3] and Jenatton et al. (2012)[4], but I'm not sure.

Another model, from Riedel et al. (2013)[5] and Rocktäschel et al. (2015)[6], represents relations as a binary matrix of size |P|×|R|, where P is the set of pairs of constants and R is the set of predicates. The open-world assumption is used, and the authors optimize a log-likelihood instead, with negative examples drawn randomly.
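A minimal sketch of this matrix-factorization setup (variable names and hyperparameters are mine, not the authors'): embed each entity pair and each predicate as a vector, score a cell of the |P|×|R| matrix with a dot product, and train a logistic log-likelihood where only observed cells are treated as positive and randomly sampled cells serve as negatives — unobserved cells are not all assumed false:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pairs, n_preds, dim = 50, 10, 8                # toy sizes
P = rng.normal(scale=0.1, size=(n_pairs, dim))   # entity-pair embeddings
Q = rng.normal(scale=0.1, size=(n_preds, dim))   # predicate embeddings

# Observed (pair, predicate) cells, here just random stand-ins for real facts.
observed = [(rng.integers(n_pairs), rng.integers(n_preds)) for _ in range(200)]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.1
for epoch in range(50):
    for p, q in observed:
        # Positive cell: gradient of log sigmoid(score) pushes the score up.
        g = 1.0 - sigmoid(P[p] @ Q[q])
        P[p], Q[q] = P[p] + lr * g * Q[q], Q[q] + lr * g * P[p]
        # Randomly sampled cell used as a negative: push its score down.
        pn, qn = rng.integers(n_pairs), rng.integers(n_preds)
        g = -sigmoid(P[pn] @ Q[qn])
        P[pn], Q[qn] = P[pn] + lr * g * Q[qn], Q[qn] + lr * g * P[pn]
```

After training, observed cells should score higher on average than arbitrary cells, which is all the ranking objective asks for.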

More expressive (and more complicated) models are also possible, such as the neural tensor network of Socher et al. (2013)[7].
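For a flavor of the extra expressiveness, here is the neural tensor network scoring function for a single relation, with made-up dimensions: the score of a pair (e1, e2) is uT tanh(e1T W e2 + V [e1; e2] + b), where W is a d×d×s tensor whose bilinear term yields an s-vector:

```python
import numpy as np

rng = np.random.default_rng(0)
d, s = 4, 3                      # embedding dim and tensor slices (arbitrary)
W = rng.normal(size=(s, d, d))   # bilinear tensor for one relation
V = rng.normal(size=(s, 2 * d))  # standard feed-forward weights
b = rng.normal(size=s)           # bias
u = rng.normal(size=s)           # output weights

def ntn_score(e1, e2):
    """u^T tanh( e1^T W e2 + V [e1; e2] + b ) for one relation."""
    bilinear = np.array([e1 @ W[i] @ e2 for i in range(s)])
    return u @ np.tanh(bilinear + V @ np.concatenate([e1, e2]) + b)

e1, e2 = rng.normal(size=d), rng.normal(size=d)
print(ntn_score(e1, e2))
```

Setting W to zero reduces this to a single-layer network over concatenated embeddings, and dropping V and the tanh recovers a purely bilinear score, which is one way to see where the extra capacity comes from.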

A systematic classification of, and comparison between, these models is desperately needed.

References

  1. M. Nickel, V. Tresp, and H. Kriegel. A Three-Way Model for Collective Learning on Multi-Relational Data. In Proceedings of the 28th International Conference on Machine Learning, ICML ’11, pages 809–816, Bellevue, WA, USA, 2011. ACM.
  2. Nickel, M., Tresp, V., & Kriegel, H.-P. (2012). Factorizing YAGO: scalable machine learning for linked data. In Proceedings of the 21st international conference on World Wide Web (pp. 271–280). New York, NY, USA: ACM. doi:10.1145/2187836.2187874
  3. I. Sutskever, R. Salakhutdinov, and J. B. Tenenbaum. Modelling relational data using Bayesian clustered tensor factorization. In NIPS, 2009.
  4. R. Jenatton, N. Le Roux, A. Bordes, and G. Obozinski. A latent factor model for highly multi-relational data. In NIPS, 2012.
  5. Riedel, S., Yao, L., Marlin, B. M., & McCallum, A. (2013). Relation Extraction with Matrix Factorization and Universal Schemas. In Joint Human Language Technology Conference/Annual Meeting of the North American Chapter of the Association for Computational Linguistics (HLT-NAACL ’13).
  6. Rocktäschel, T., Singh, S., & Riedel, S. (2015). Injecting Logical Background Knowledge into Embeddings for Relation Extraction. In Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (pp. 1119–1129). Association for Computational Linguistics.
  7. Socher, R., Chen, D., Manning, C. D., & Ng, A. Y. (2013). Reasoning With Neural Tensor Networks for Knowledge Base Completion. Advances in Neural Information Processing Systems, 926–934.