Files in this item:
File | Description | Size | Format | |
---|---|---|---|---|
v1_2023.findings-eacl.195.pdf | | 287.22 kB | Adobe PDF | View/Open |
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wan, Zhen | en |
dc.contributor.author | Cheng, Fei | en |
dc.contributor.author | Liu, Qianying | en |
dc.contributor.author | Mao, Zhuoyuan | en |
dc.contributor.author | Song, Haiyue | en |
dc.contributor.author | Kurohashi, Sadao | en |
dc.contributor.alternative | 万, 振 | ja |
dc.contributor.alternative | 程, 飛 | ja |
dc.contributor.alternative | 劉, 倩瑩 | ja |
dc.contributor.alternative | 毛, 卓遠 | ja |
dc.contributor.alternative | 宋, 海越 | ja |
dc.contributor.alternative | 黒橋, 禎夫 | ja |
dc.date.accessioned | 2024-02-08T04:34:16Z | - |
dc.date.available | 2024-02-08T04:34:16Z | - |
dc.date.issued | 2023-05 | - |
dc.identifier.uri | http://hdl.handle.net/2433/286922 | - |
dc.description | 17th Conference of the European Chapter of the Association for Computational Linguistics, May 2-6, 2023 | en |
dc.description.abstract | Contrastive pre-training on distant supervision has shown remarkable effectiveness in improving supervised relation extraction tasks. However, the existing methods ignore the intrinsic noise of distant supervision during the pre-training stage. In this paper, we propose a weighted contrastive learning method by leveraging the supervised data to estimate the reliability of pre-training instances and explicitly reduce the effect of noise. Experimental results on three supervised datasets demonstrate the advantages of our proposed weighted contrastive learning approach compared to two state-of-the-art non-weighted baselines. Our code and models are available at: https://github.com/YukinoWan/WCL. | en |
dc.language.iso | eng | - |
dc.publisher | Association for Computational Linguistics | en |
dc.rights | ©2023 Association for Computational Linguistics | en |
dc.rights | ACL materials are Copyright © 1963–2024 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 here are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License. Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License. | en |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | - |
dc.title | Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision | en |
dc.type | conference paper | - |
dc.type.niitype | Conference Paper | - |
dc.identifier.jtitle | Findings of the Association for Computational Linguistics: EACL 2023 | en |
dc.identifier.spage | 2580 | - |
dc.identifier.epage | 2585 | - |
dc.relation.doi | 10.18653/v1/2023.findings-eacl.195 | - |
dc.textversion | publisher | - |
dcterms.accessRights | open access | - |
datacite.awardNumber | 22J13719 | - |
datacite.awardNumber | 21J23124 | - |
datacite.awardNumber | 21H00308 | - |
datacite.awardNumber.uri | https://kaken.nii.ac.jp/grant/KAKENHI-PROJECT-22KJ1843/ | - |
datacite.awardNumber.uri | https://kaken.nii.ac.jp/grant/KAKENHI-PROJECT-22KJ1724/ | - |
datacite.awardNumber.uri | https://kaken.nii.ac.jp/grant/KAKENHI-PUBLICLY-21H00308/ | - |
jpcoar.funderName | 日本学術振興会 | ja |
jpcoar.funderName | 日本学術振興会 | ja |
jpcoar.funderName | 日本学術振興会 | ja |
jpcoar.awardTitle | 事前学習と多言語意味表現学習を統合した低資源機械翻訳 | ja |
jpcoar.awardTitle | 多言語コーパス構築とドメイン適応による低資源機械翻訳 | ja |
jpcoar.awardTitle | Temporal knowledge supervision for pre-training transfer learning models | en |
Appears in Collections: | Journal Articles, etc. |

This item is licensed under a Creative Commons License.
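
The key idea in the abstract is a contrastive pre-training loss in which each distantly supervised instance is down-weighted by an estimated reliability score. The sketch below is a hypothetical illustration of such a weighted InfoNCE-style objective, not the authors' implementation (their actual code is in the linked WCL repository); the function name, tensor shapes, and the assumption that reliability weights in [0, 1] are supplied externally are all assumptions.

```python
import torch
import torch.nn.functional as F

def weighted_info_nce(z, z_pos, weights, temperature=0.1):
    """z, z_pos: (N, d) embeddings of N instances and their distantly
    supervised positives; weights: (N,) per-instance reliability in [0, 1],
    e.g. estimated with a model trained on the supervised data."""
    z = F.normalize(z, dim=-1)
    z_pos = F.normalize(z_pos, dim=-1)
    # Cosine-similarity logits: the diagonal holds the true (possibly noisy)
    # pairs, and off-diagonal entries act as in-batch negatives.
    logits = z @ z_pos.t() / temperature
    targets = torch.arange(z.size(0), device=z.device)
    per_instance = F.cross_entropy(logits, targets, reduction="none")
    # Low-reliability (likely noisy) pairs contribute less to the loss.
    return (weights * per_instance).sum() / weights.clamp(min=1e-8).sum()
```

Setting all weights to 1 recovers a standard non-weighted contrastive objective, which is the kind of baseline the abstract reports comparisons against.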