
Files in this item:
File | Description | Size | Format
v1_2023.findings-eacl.195.pdf |  | 287.22 kB | Adobe PDF
Full metadata record
DC Field | Value | Language
dc.contributor.author | Wan, Zhen | en
dc.contributor.author | Cheng, Fei | en
dc.contributor.author | Liu, Qianying | en
dc.contributor.author | Mao, Zhuoyuan | en
dc.contributor.author | Song, Haiyue | en
dc.contributor.author | Kurohashi, Sadao | en
dc.contributor.alternative | 万, 振 | ja
dc.contributor.alternative | 程, 飛 | ja
dc.contributor.alternative | 劉, 倩瑩 | ja
dc.contributor.alternative | 毛, 卓遠 | ja
dc.contributor.alternative | 宋, 海越 | ja
dc.contributor.alternative | 黒橋, 禎夫 | ja
dc.date.accessioned | 2024-02-08T04:34:16Z | -
dc.date.available | 2024-02-08T04:34:16Z | -
dc.date.issued | 2023-05 | -
dc.identifier.uri | http://hdl.handle.net/2433/286922 | -
dc.description | 17th Conference of the European Chapter of the Association for Computational Linguistics, May 2-6, 2023 | en
dc.description.abstract | Contrastive pre-training on distant supervision has shown remarkable effectiveness in improving supervised relation extraction tasks. However, the existing methods ignore the intrinsic noise of distant supervision during the pre-training stage. In this paper, we propose a weighted contrastive learning method by leveraging the supervised data to estimate the reliability of pre-training instances and explicitly reduce the effect of noise. Experimental results on three supervised datasets demonstrate the advantages of our proposed weighted contrastive learning approach compared to two state-of-the-art non-weighted baselines. Our code and models are available at: https://github.com/YukinoWan/WCL. | en
dc.language.iso | eng | -
dc.publisher | Association for Computational Linguistics | en
dc.rights | ©2023 Association for Computational Linguistics | en
dc.rights | ACL materials are Copyright © 1963–2024 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 here are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License. Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License. | en
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | -
dc.title | Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision | en
dc.type | conference paper | -
dc.type.niitype | Conference Paper | -
dc.identifier.jtitle | Findings of the Association for Computational Linguistics: EACL 2023 | en
dc.identifier.spage | 2580 | -
dc.identifier.epage | 2585 | -
dc.relation.doi | 10.18653/v1/2023.findings-eacl.195 | -
dc.textversion | publisher | -
dcterms.accessRights | open access | -
datacite.awardNumber | 22J13719 | -
datacite.awardNumber | 21J23124 | -
datacite.awardNumber | 21H00308 | -
datacite.awardNumber.uri | https://kaken.nii.ac.jp/grant/KAKENHI-PROJECT-22KJ1843/ | -
datacite.awardNumber.uri | https://kaken.nii.ac.jp/grant/KAKENHI-PROJECT-22KJ1724/ | -
datacite.awardNumber.uri | https://kaken.nii.ac.jp/grant/KAKENHI-PUBLICLY-21H00308/ | -
jpcoar.funderName | Japan Society for the Promotion of Science (日本学術振興会) | ja
jpcoar.funderName | Japan Society for the Promotion of Science (日本学術振興会) | ja
jpcoar.funderName | Japan Society for the Promotion of Science (日本学術振興会) | ja
jpcoar.awardTitle | Low-resource machine translation integrating pre-training and multilingual semantic representation learning (事前学習と多言語意味表現学習を統合した低資源機械翻訳) | ja
jpcoar.awardTitle | Low-resource machine translation through multilingual corpus construction and domain adaptation (多言語コーパス構築とドメイン適応による低資源機械翻訳) | ja
jpcoar.awardTitle | Temporal knowledge supervision for pre-training transfer learning models | en
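
As a rough illustration of the weighted contrastive pre-training described in the abstract above: a minimal, hypothetical PyTorch sketch of how per-instance reliability weights could enter a supervised contrastive loss. This is not the authors' implementation (their code is at https://github.com/YukinoWan/WCL); the `weights` tensor is assumed to hold reliability scores estimated from supervised data, and all names here are illustrative.

```python
import torch
import torch.nn.functional as F

def weighted_contrastive_loss(embeddings, labels, weights, temperature=0.1):
    """Sketch of a reliability-weighted supervised contrastive loss.

    embeddings: (n, d) instance representations
    labels:     (n,) distantly supervised relation labels
    weights:    (n,) assumed per-instance reliability scores in [0, 1]
    """
    z = F.normalize(embeddings, dim=1)          # cosine similarity space
    sim = (z @ z.t()) / temperature             # (n, n) scaled similarities

    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))  # exclude self-pairs

    # Positives: other instances sharing the same (noisy) relation label.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

    # log p(j | i) over all non-self candidates, as in InfoNCE.
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Average log-probability of positives per anchor (zero out non-positives
    # first to avoid -inf * 0 producing NaN on the diagonal).
    pos_counts = pos_mask.sum(1).clamp(min=1)
    mean_log_prob_pos = log_prob.masked_fill(~pos_mask, 0.0).sum(1) / pos_counts

    # Key idea: down-weight anchors whose distant label is likely noisy.
    loss = -(weights * mean_log_prob_pos)

    has_pos = pos_mask.sum(1) > 0               # skip anchors with no positives
    return loss[has_pos].mean()

# Toy usage with random embeddings, distant labels, and reliability weights.
emb = torch.randn(8, 128)
labels = torch.randint(0, 3, (8,))
weights = torch.rand(8)  # e.g., output of a reliability estimator
print(weighted_contrastive_loss(emb, labels, weights).item())
```

Setting all weights to 1 recovers an ordinary non-weighted contrastive loss, i.e., the kind of baseline the abstract says the weighted approach is compared against.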
Appears in collections: Journal Articles, etc. (学術雑誌掲載論文等)

This item is licensed under the following license: Creative Commons