Loss-Balanced Task Weighting to Reduce Negative Transfer in Multi-Task Learning
2019 ◽ Vol 33 ◽ pp. 9977-9978
In settings with related prediction tasks, integrated multi-task learning models can often improve performance relative to independent single-task models. However, even when average task performance improves, individual tasks may experience negative transfer, in which the multi-task model's predictions are worse than the single-task model's. We show the prevalence of negative transfer in a computational chemistry case study with 128 tasks and introduce a framework that provides a foundation for reducing negative transfer in multi-task models. Our Loss-Balanced Task Weighting approach dynamically updates task weights during model training to control the influence of individual tasks.
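The abstract describes weights that are updated during training based on each task's loss. A minimal sketch of one plausible instantiation, assuming the weight for a task is the ratio of its current loss to its initial loss raised to a balancing exponent `alpha` (the function names and the exact formula here are illustrative assumptions, not necessarily the paper's exact procedure):

```python
def lbtw_weights(initial_losses, current_losses, alpha=0.5):
    """Illustrative loss-balanced weights: tasks whose loss has already
    dropped far below its initial value are down-weighted, so slower
    tasks keep more influence on the shared model (hypothetical sketch)."""
    return [(cur / init) ** alpha
            for init, cur in zip(initial_losses, current_losses)]

def weighted_total_loss(task_losses, weights):
    # Combine per-task losses into the single scalar used for the
    # gradient step, scaling each task by its dynamic weight.
    return sum(w * l for w, l in zip(weights, task_losses))

# Two tasks start at loss 1.0; task 0 has improved to 0.25, task 1 has not.
weights = lbtw_weights([1.0, 1.0], [0.25, 1.0], alpha=0.5)
total = weighted_total_loss([0.25, 1.0], weights)
```

In this toy run the improved task gets weight 0.5 while the stalled task keeps weight 1.0, so the stalled task dominates the combined loss, which is the intuition behind reducing negative transfer by rebalancing.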