BERT Goes Shopping: Comparing Distributional Models for Product Representations

(2020). arXiv:2012.09807. Comment: Updated version. Published as a workshop paper at ECNLP 4 at ACL-IJCNLP 2021.

Abstract

Word embeddings (e.g., word2vec) have been applied successfully to eCommerce products through prod2vec. Inspired by the recent performance improvements on several NLP tasks brought by contextualized embeddings, we propose to transfer BERT-like architectures to eCommerce: our model -- Prod2BERT -- is trained to generate representations of products through masked session modeling. Through extensive experiments over multiple shops, different tasks, and a range of design choices, we systematically compare the accuracy of Prod2BERT and prod2vec embeddings: while Prod2BERT is found to be superior in several scenarios, we highlight the importance of resources and hyperparameters in the best performing models. Finally, we provide guidelines to practitioners for training embeddings under a variety of computational and data constraints.
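
As a rough illustration of the masked session modeling objective described in the abstract, the sketch below treats each shopping session as a sequence of product-ID tokens and trains a small BERT-style encoder with a masked-token loss using the HuggingFace transformers library. The session data, vocabulary size, special-token IDs, and all hyperparameters are illustrative assumptions, not the paper's actual configuration.

import torch
from torch.utils.data import Dataset, DataLoader
from transformers import BertConfig, BertForMaskedLM

# Toy shopping sessions: each is a list of product-ID tokens (hypothetical data).
sessions = [[5, 12, 7, 3], [8, 5, 19, 2, 11], [3, 7, 14]]
VOCAB_SIZE = 32          # distinct products plus special tokens (assumed)
PAD_ID, MASK_ID = 0, 1   # reserved special-token IDs (assumed convention)
MAX_LEN = 8

class SessionDataset(Dataset):
    def __init__(self, sessions):
        self.sessions = sessions

    def __len__(self):
        return len(self.sessions)

    def __getitem__(self, idx):
        ids = self.sessions[idx][:MAX_LEN]
        ids = ids + [PAD_ID] * (MAX_LEN - len(ids))
        input_ids = torch.tensor(ids)
        attention_mask = (input_ids != PAD_ID).long()
        labels = input_ids.clone()
        # Mask ~15% of non-padding positions; only masked positions contribute
        # to the loss (labels elsewhere are set to -100 and ignored).
        mask = (torch.rand(MAX_LEN) < 0.15) & (input_ids != PAD_ID)
        if not mask.any():
            mask[0] = True  # guarantee at least one masked position
        labels[~mask] = -100
        input_ids = input_ids.masked_fill(mask, MASK_ID)
        return {"input_ids": input_ids, "attention_mask": attention_mask, "labels": labels}

config = BertConfig(vocab_size=VOCAB_SIZE, hidden_size=64, num_hidden_layers=2,
                    num_attention_heads=2, max_position_embeddings=MAX_LEN)
model = BertForMaskedLM(config)
loader = DataLoader(SessionDataset(sessions), batch_size=2, shuffle=True)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

model.train()
for epoch in range(3):
    for batch in loader:
        loss = model(**batch).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

# Contextualized product representations can then be read off the encoder's
# hidden states for each position of a session.

For a prod2vec-style baseline of the kind the paper compares against, the same sessions could instead be passed as "sentences" to a standard word2vec implementation (e.g., gensim's Word2Vec), yielding one static embedding per product.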
