We often come across the term 'ablation study' in machine learning papers; for example, the original R-CNN paper includes a section of ablation studies. But what does this mean?
Well, when we build a model, it usually consists of several components. If we remove one of them, what is the effect on the model? This is a rough definition of an ablation study: we measure the contribution of a proposed component by comparing the model that includes it against the same model without it.
In the R-CNN paper, in order to see the effect of fine-tuning the CNN, the authors compared the model's performance with and without fine-tuning. This way, the contribution of fine-tuning is easy to see.
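As a minimal sketch of how such an experiment can be organized (this is not code from the R-CNN paper; `build_model`, `train`, `evaluate`, and the config flags are hypothetical placeholders), an ablation study can be run as a loop that toggles one component at a time while keeping everything else identical:

```python
# Minimal sketch of an ablation loop. All names here (build_model, train,
# evaluate, the config flags) are hypothetical placeholders, not the
# actual R-CNN code; replace them with your own training pipeline.

def build_model(fine_tune, use_bbox_reg):
    # Placeholder: return a config dict standing in for a real model.
    return {"fine_tune": fine_tune, "use_bbox_reg": use_bbox_reg}

def train(model, train_data):
    # Placeholder: a real implementation would run the same training
    # protocol for every variant so the results stay comparable.
    pass

def evaluate(model, val_data):
    # Placeholder metric: pretend each enabled component adds some score.
    return 0.5 + 0.1 * model["fine_tune"] + 0.05 * model["use_bbox_reg"]

def run_ablation(train_data, val_data):
    # Toggle one component at a time; the full model enables everything.
    configs = {
        "full model":         {"fine_tune": True,  "use_bbox_reg": True},
        "no fine-tuning":     {"fine_tune": False, "use_bbox_reg": True},
        "no bbox regression": {"fine_tune": True,  "use_bbox_reg": False},
    }
    results = {}
    for name, cfg in configs.items():
        model = build_model(**cfg)
        train(model, train_data)
        results[name] = evaluate(model, val_data)
    return results

if __name__ == "__main__":
    for name, score in run_ablation(train_data=None, val_data=None).items():
        print(f"{name}: {score:.2f}")
```

The key design point is that only the component under study changes between runs; the data, training protocol, and evaluation metric stay fixed, so any difference in the scores can be attributed to that component.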
The following explanation, which I copied from Jonathan Uesato's answer on Quora, puts it very well:
Original source: https://www.cnblogs.com/sddai/p/11172050.html