NLPIR/ICTCLA2018 ACADEMIC SEMINAR 1st ISSUE


        INTRO

        In the new semester, our lab, the Web Search Mining and Security Lab, plans to hold an academic seminar every Wednesday. Each time, a keynote speaker will share his or her understanding of papers published in recent years.


        This week’s seminar is organized as follows:


        1. The seminar will be held at 1 pm tomorrow in Room 1013 of the Center Building.

        2. The lecturer is Zhang Xi; the paper is "A Neural Attention Model for Abstractive Sentence Summarization".


        3. The paper for this seminar is attached; please download it in advance.

 


Abstract

        Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
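The core of the model described above is an attention mechanism: at each decoding step, the summary-side state scores every input position, and a softmax over those scores yields a weighted average (context vector) of the input representations used to predict the next summary word. The sketch below illustrates that single step with NumPy; all names, dimensions, and the dot-product scoring function are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of one attention step in an attention-based
# summarizer: score input positions against the decoder state, softmax
# the scores, and build a context vector as the weighted sum.
import numpy as np

def attention_context(enc_states, dec_state):
    """enc_states: (T, d) encoder representations of the input sentence.
    dec_state: (d,) current summary-side (decoder) state.
    Returns (context, weights): the (d,) context vector and the (T,)
    attention distribution over input positions."""
    scores = enc_states @ dec_state        # (T,) alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()               # softmax over input positions
    context = weights @ enc_states         # weighted sum of encoder states
    return context, weights

rng = np.random.default_rng(0)
enc = rng.normal(size=(6, 4))  # toy input: 6 words, hidden size 4
dec = rng.normal(size=4)       # toy decoder state
ctx, w = attention_context(enc, dec)
```

In the full model, `ctx` would be combined with the decoder state to produce a distribution over the output vocabulary for the next summary word; training end-to-end simply backpropagates through this weighted sum.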

 



About the Author: nlpir
