With the ever-increasing volume of content produced today, it is becoming more and more important to apply document summarization methods that preserve the meaning of the original document. Document summaries are used everywhere, from scientific articles to news sites, where the ability to summarize quickly is valuable when publishing documents and content. A summarization system for Persian that can produce a high-quality summary from unstructured text can therefore be applied in many settings. In this research, an abstractive summarization method is presented, based on recurrent neural networks with the Long Short-Term Memory (LSTM) architecture and the seq2seq model combined with an attention mechanism. The evaluation results show that the summarization method proposed in this study, using the seq2seq model with the attention mechanism, improves the evaluation metrics. We compared the presented model with three models proposed for English and one model proposed for Persian. The ROUGE metric was used to measure the quality of the model's results.
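To illustrate the core idea behind the attention mechanism in a seq2seq decoder, the following is a minimal sketch in pure Python. It uses simple dot-product scoring for clarity; the actual model in the paper is an LSTM-based encoder-decoder, and the function and variable names here are hypothetical, not taken from the paper's implementation. At each decoding step, the decoder state is scored against every encoder hidden state, the scores are normalized with a softmax, and the resulting weights form a context vector that focuses the decoder on the most relevant parts of the source document.

```python
import math

def dot_product_attention(decoder_state, encoder_states):
    """Return (attention_weights, context_vector) for one decoding step.

    Illustrative simplification: real seq2seq models score learned
    LSTM hidden states; here we use plain lists of floats.
    """
    # Score each encoder state against the current decoder state.
    scores = [sum(d * e for d, e in zip(decoder_state, enc))
              for enc in encoder_states]
    # Softmax normalization to obtain attention weights that sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: attention-weighted sum of the encoder states.
    dim = len(decoder_state)
    context = [sum(w * enc[i] for w, enc in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context

# Toy example: three encoder states, one decoder state.
encoder_states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
decoder_state = [1.0, 0.0]
weights, context = dot_product_attention(decoder_state, encoder_states)
```

In this toy example the first and third encoder states align with the decoder state and therefore receive equal, larger attention weights than the second; the context vector the decoder consumes is biased toward them accordingly.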