
Minimizing Supervision in Multi-label Categorization (CS CV) --- 刘持诚

Multiple categories of objects are present in most images, so treating the task as multi-class classification is not justified; we instead treat it as a multi-label classification problem. In this paper, we further aim to minimize the supervision required for multi-label classification. Specifically, we investigate an effective class of approaches that associate a weak localization, either a bounding box or a segmentation mask, with each category; doing so improves the accuracy of multi-label classification. The approach we adopt is active learning: based on the current model, we incrementally select a set of samples that need supervision, obtain supervision for those samples, retrain the model with the additional supervised samples, and then select the next set. A crucial concern is how to choose this set of samples. Here we provide a novel insight: no single measure yields a consistently improved selection criterion. We therefore propose a selection criterion that consistently improves on the overall baseline criterion by choosing the top-k sets of samples under a varied set of criteria. Using this criterion, we show that we can retain more than 98% of the fully supervised performance with just 20% of the samples (and more than 96% with 10%) on the PASCAL VOC 2007 and 2012 datasets. Moreover, the proposed approach consistently outperforms all other baseline metrics across all benchmark datasets and model combinations.
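The abstract above describes an active-learning loop in which samples are scored by several criteria and the top-k sets are combined before requesting annotation. Below is a minimal sketch of that loop, not the authors' implementation: the entropy and margin scores, the union-of-top-k combination rule, and the train_model / annotate / predict_proba helpers are illustrative assumptions introduced here for clarity.

```python
# Minimal sketch of an active-learning loop for multi-label classification.
# The scoring functions and the union-of-top-k combination rule are
# illustrative assumptions, NOT the paper's exact criteria; train_model()
# and annotate() are hypothetical placeholders supplied by the caller.
import numpy as np


def entropy_score(probs):
    """Mean per-label binary entropy for each sample (higher = more uncertain)."""
    eps = 1e-12
    p = np.clip(probs, eps, 1 - eps)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p)).mean(axis=1)


def margin_score(probs):
    """How close each sample's label probabilities sit to the 0.5 boundary."""
    return -np.abs(probs - 0.5).mean(axis=1)  # higher = closer to the boundary


def select_top_k_union(probs, k, criteria):
    """Take the top-k samples under each criterion and return the union."""
    selected = set()
    for score_fn in criteria:
        scores = score_fn(probs)                     # shape (N,)
        selected.update(np.argsort(scores)[-k:].tolist())
    return sorted(selected)


def active_learning_loop(unlabeled, labeled, rounds, k, train_model, annotate):
    """Incrementally request supervision for the chosen samples and retrain."""
    model = train_model(labeled)
    for _ in range(rounds):
        probs = model.predict_proba(unlabeled)       # (N, num_labels) in [0, 1]
        picked = select_top_k_union(probs, k, [entropy_score, margin_score])
        labeled += annotate(unlabeled, picked)       # obtain labels / boxes / masks
        unlabeled = [x for i, x in enumerate(unlabeled) if i not in set(picked)]
        model = train_model(labeled)                 # retrain with extra supervision
    return model
```

The union rule is one simple way to pool "top-k sets for a varied set of criteria"; other combinations (intersection, rank averaging) would fit the same loop.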

Original title: Minimizing Supervision in Multi-label Categorization

Original abstract: Multiple categories of objects are present in most images. Treating this as a multi-class classification is not justified. We treat this as a multi-label classification problem. In this paper, we further aim to minimize the supervision required for providing supervision in multi-label classification. Specifically, we investigate an effective class of approaches that associate a weak localization with each category either in terms of the bounding box or segmentation mask. Doing so improves the accuracy of multi-label categorization. The approach we adopt is one of active learning, i.e., incrementally selecting a set of samples that need supervision based on the current model, obtaining supervision for these samples, retraining the model with the additional set of supervised samples and proceeding again to select the next set of samples. A crucial concern is the choice of the set of samples. In doing so, we provide a novel insight, and no specific measure succeeds in obtaining a consistently improved selection criterion. We, therefore, provide a selection criterion that consistently improves the overall baseline criterion by choosing the top k set of samples for a varied set of criteria. Using this criterion, we are able to show that we can retain more than 98% of the fully supervised performance with just 20% of samples (and more than 96% using 10%) of the dataset on PASCAL VOC 2007 and 2012. Also, our proposed approach consistently outperforms all other baseline metrics for all benchmark datasets and model combinations.

Original authors: Rajat, Munender Varshney, Pravendra Singh, Vinay P. Namboodiri

Original URL: https://arxiv.org/abs/2005.12892

