SimCLR contrastive loss implementation - 知乎专栏 (Zhihu Column)


SimCLR: contrastive loss. Compute the similarity between samples, then compute the NT-Xent (normalized temperature-scaled cross-entropy) loss:

```python
import tensorflow as tf
import numpy as np

def contrastive_loss(out, out_aug, batch_size=128, hidden_norm=False, temperature=1.0):
    # Optionally L2-normalize the representations so the logits are cosine similarities
    if hidden_norm:
        out = tf.nn.l2_normalize(out, -1)
        out_aug = tf.nn.l2_normalize(out_aug, -1)
    INF = np.inf
    labels = tf.one_hot(tf.range(batch_size), batch_size * 2)  # [batch_size, 2*batch_size]
    masks = tf.one_hot(tf.range(batch_size), batch_size)       # [batch_size, batch_size]
    logits_aa = tf.matmul(out, out, transpose_b=True) / temperature          # [batch_size, batch_size]
    logits_bb = tf.matmul(out_aug, out_aug, transpose_b=True) / temperature  # [batch_size, batch_size]
    logits_aa = logits_aa - masks * INF  # mask out each sample's similarity with itself in out
    logits_bb = logits_bb - masks * INF  # mask out each sample's similarity with itself in out_aug
    logits_ab = tf.matmul(out, out_aug, transpose_b=True) / temperature
    logits_ba = tf.matmul(out_aug, out, transpose_b=True) / temperature
    loss_a = tf.losses.softmax_cross_entropy(
        labels, tf.concat([logits_ab, logits_aa], 1))
    loss_b = tf.losses.softmax_cross_entropy(
        labels, tf.concat([logits_ba, logits_bb], 1))
    loss = loss_a + loss_b
    return loss, logits_ab
```

Suppose `batch_size = 3`, and `out` and `out_aug` are the representations of the original data and the augmented data, respectively:

```
out:     [a1, a2, a3]
out_aug: [b1, b2, b3]

labels: [batch_size, 2*batch_size], batch_size = 3
1 0 0 0 0 0
0 1 0 0 0 0
0 0 1 0 0 0

masks: [batch_size, batch_size]
1 0 0
0 1 0
0 0 1

logits_aa: [batch_size, batch_size]
a1*a1, a1*a2, a1*a3
a2*a1, a2*a2, a2*a3
a3*a1, a3*a2, a3*a3

logits_bb: [batch_size, batch_size]
b1*b1, b1*b2, b1*b3
b2*b1, b2*b2, b2*b3
b3*b1, b3*b2, b3*b3

logits_aa - INF*masks   # delete self-similarities
-INF,  a1*a2, a1*a3
a2*a1, -INF,  a2*a3
a3*a1, a3*a2, -INF

logits_bb - INF*masks   # delete self-similarities
-INF,  b1*b2, b1*b3
b2*b1, -INF,  b2*b3
b3*b1, b3*b2, -INF

logits_ab: [batch_size, batch_size]
a1*b1, a1*b2, a1*b3
a2*b1, a2*b2, a2*b3
a3*b1, a3*b2, a3*b3

logits_ba: [batch_size, batch_size]
b1*a1, b1*a2, b1*a3
b2*a1, b2*a2, b2*a3
b3*a1, b3*a2, b3*a3

concat([logits_ab, logits_aa], 1):
a1*b1, a1*b2, a1*b3, -INF,  a1*a2, a1*a3
a2*b1, a2*b2, a2*b3, a2*a1, -INF,  a2*a3
a3*b1, a3*b2, a3*b3, a3*a1, a3*a2, -INF
Only a1*b1, a2*b2, a3*b3 are positives.

concat([logits_ba, logits_bb], 1):
b1*a1, b1*a2, b1*a3, -INF,  b1*b2, b1*b3
b2*a1, b2*a2, b2*a3, b2*b1, -INF,  b2*b3
b3*a1, b3*a2, b3*a3, b3*b1, b3*b2, -INF
Only b1*a1, b2*a2, b3*a3 are positives, so the softmax cross-entropy is computed against labels.
```

Published 2020-04-15 11:38
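As a sanity check, the same computation can be sketched in plain NumPy without any TensorFlow dependency. This is an assumed-equivalent reference, not the author's code; `nt_xent_numpy` and its helper are names introduced here for illustration.

```python
import numpy as np

def nt_xent_numpy(out, out_aug, temperature=1.0, hidden_norm=False):
    """NumPy sketch of the NT-Xent loss above (assumed equivalent to the TF version)."""
    if hidden_norm:
        out = out / np.linalg.norm(out, axis=-1, keepdims=True)
        out_aug = out_aug / np.linalg.norm(out_aug, axis=-1, keepdims=True)
    n = out.shape[0]
    logits_aa = out @ out.T / temperature
    logits_bb = out_aug @ out_aug.T / temperature
    # Mask self-similarities with -inf so they never act as negatives
    np.fill_diagonal(logits_aa, -np.inf)
    np.fill_diagonal(logits_bb, -np.inf)
    logits_ab = out @ out_aug.T / temperature
    logits_ba = out_aug @ out.T / temperature

    def xent(logits):
        # The positive pair for row i sits at column i (the logits_ab / logits_ba half)
        logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
        log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        return -log_prob[np.arange(n), np.arange(n)].mean()

    loss_a = xent(np.concatenate([logits_ab, logits_aa], axis=1))
    loss_b = xent(np.concatenate([logits_ba, logits_bb], axis=1))
    return loss_a + loss_b
```

With `out = out_aug = np.eye(3)` each row's positive logit is 1 and the four surviving negatives are 0, so the loss works out to `2 * log(1 + 4/e)` exactly; scaling the representations up (sharper logits) drives the loss toward zero, which matches the intuition that confident positives are rewarded.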


