A New Method for Sentence Vector Normalization Using Word2vec
Abstract:
Word embeddings (WE) have recently received much attention as an architecture for mapping words to numeric vectors and have become a great asset for a wide variety of NLP tasks. Most text processing tasks convert text components such as sentences into numeric matrices before applying their processing algorithms. However, the main problem in all word-vector-based text processing approaches is that sentences differ in length and therefore produce sentence matrices of different dimensions. In this paper, we propose a simple yet efficient statistical method to convert text sentences into normalized matrices of equal dimension. The proposed method combines three of the most effective existing methods (averaging-based representations, most-likely n-grams, and Word Mover's Distance) to exploit their advantages and reduce their limitations. The size of the resulting matrix does not depend on the language, the subject and scope of the text, or the semantic concepts of the words. Our results demonstrate that the normalized matrices capture complementary aspects of most text processing tasks, such as coherence evaluation, text summarization, text classification, automatic essay scoring, and question answering.
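As a minimal sketch of the averaging-based baseline mentioned in the abstract (not the paper's full combined method), the snippet below uses gensim's Word2Vec to map sentences of different lengths to vectors of one fixed dimension; the corpus, parameter values, and helper name are illustrative assumptions only.

```python
import numpy as np
from gensim.models import Word2Vec

# Toy corpus for illustration; in practice one would train on a large corpus
# or load pretrained vectors (e.g. via KeyedVectors.load_word2vec_format).
sentences = [
    ["text", "processing", "needs", "fixed", "size", "inputs"],
    ["word", "embeddings", "map", "words", "to", "numeric", "vectors"],
]
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, seed=1)

def sentence_vector(tokens, model):
    """Average the Word2vec vectors of in-vocabulary tokens.

    Every sentence, regardless of its length, is mapped to a single
    vector of dimension `vector_size`, giving equal-sized representations.
    """
    vecs = [model.wv[t] for t in tokens if t in model.wv]
    if not vecs:
        return np.zeros(model.vector_size)
    return np.mean(vecs, axis=0)

v1 = sentence_vector(sentences[0], model)
v2 = sentence_vector(sentences[1], model)
print(v1.shape, v2.shape)  # both (50,), independent of sentence length
```

The averaging step is only one of the three components the abstract names; the n-gram weighting and Word Mover's Distance parts are described in the full paper and are not reproduced here.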
Article Type:
Research/Original Article
Language:
English
Published:
International Journal of Nonlinear Analysis and Applications, Volume 10, Issue 2, 2019
Pages:
87 - 96
magiran.com/p2069834  