I am reproducing the paper https://arxiv.org/abs/1711.11575, which uses a weighted softmax in one of its formulas. I searched Chainer, but it only has F.softmax, which cannot apply weights. How can I reimplement that formula? ...
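A weighted softmax is straightforward to write from its definition, so one option is to build it from primitive ops rather than F.softmax. A minimal NumPy sketch, assuming the formula multiplies each exponentiated logit by a non-negative weight w_k before normalizing (check this against the paper's exact definition):

```python
import numpy as np

def weighted_softmax(x, w, axis=-1):
    """Weighted softmax: w_k * exp(x_k) / sum_j w_j * exp(x_j).

    Subtracting the max logit first keeps exp() numerically stable.
    """
    x = x - np.max(x, axis=axis, keepdims=True)
    e = w * np.exp(x)
    return e / np.sum(e, axis=axis, keepdims=True)

x = np.array([1.0, 2.0, 3.0])
w = np.array([1.0, 1.0, 0.0])   # a zero weight masks the third logit out
p = weighted_softmax(x, w)      # p sums to 1, p[2] is exactly 0
```

In Chainer the same arithmetic works on Variables (F.exp, F.sum, broadcasting), so gradients flow through both the logits and the weights.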

How can I scale gradients when the loss comes from sparse_softmax_cross_entropy_with_logits? For example, I was trying to divide by 128 as below, but I got an error:

new_gradients = [(grad/128, var) for (grad, var) in gradients]
TypeError: unsupported operand type(s) for /: 'IndexedSlices' and 'int'

The code I was using is below:

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=labels)
gradients = opt.compute_gradients(loss)
new_gradients = [(grad/128, var) for (grad, var) in gradients]
train_step = opt.apply_gradients(new_...
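The error arises because gradients for embedding-style lookups come back as tf.IndexedSlices rather than dense tensors, and IndexedSlices does not support /. The usual fix is to divide the .values field and rebuild the slices; in TensorFlow that means constructing tf.IndexedSlices(grad.values / 128, grad.indices, grad.dense_shape). A sketch of the pattern in plain Python/NumPy, using a namedtuple as a stand-in for tf.IndexedSlices:

```python
import numpy as np
from collections import namedtuple

# Stand-in for tf.IndexedSlices: sparse gradient rows plus their row indices.
IndexedSlices = namedtuple("IndexedSlices", ["values", "indices"])

def scale_gradient(grad, factor):
    """Divide a gradient by `factor`, handling both dense and sparse forms."""
    if isinstance(grad, IndexedSlices):
        return IndexedSlices(grad.values / factor, grad.indices)
    return grad / factor

dense = np.array([128.0, 256.0])
sparse = IndexedSlices(values=np.array([[128.0, 64.0]]), indices=np.array([3]))

gradients = [(dense, "w"), (sparse, "emb")]
new_gradients = [(scale_gradient(g, 128), v) for (g, v) in gradients]
```

The indices are left untouched; only the slice values are scaled, which matches dividing the full dense gradient by the same factor.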

As mentioned here, cross entropy is not a proper loss function for multi-label classification. My question is: is this also true for cross entropy with softmax? If it is, how can that be reconciled with this part of the documentation? I should mention that the scope of my question is CNTK. ...
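The core issue is easy to see numerically: softmax forces the class probabilities to sum to 1, so two labels can never both receive high probability, whereas independent sigmoids (the usual multi-label setup) can. A small NumPy illustration, not specific to CNTK:

```python
import numpy as np

logits = np.array([4.0, 4.0, -4.0])  # two classes are clearly "on"

# Softmax: the probabilities compete for a single unit of mass,
# so the two active classes are each capped near 0.5.
p_softmax = np.exp(logits) / np.exp(logits).sum()

# Independent sigmoids: each label is scored on its own,
# so both active classes score near 0.98.
p_sigmoid = 1.0 / (1.0 + np.exp(-logits))
```

This is why softmax cross entropy is appropriate for mutually exclusive classes, while multi-label targets are usually trained with a per-label sigmoid cross entropy instead.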

As I understand it, when using a softmax over K values in the RBM's visible units, the hidden units stay binary. If so, I'm not sure how to compute the contributions of the binary hidden units to the visible ones. Am I supposed to relate the 0 state of a hidden unit to one specific state out of the K softmax states, and the 1 state to the other K-1 states? Or does a 0 in the hidden unit correspond to 0 in all K possible states of the visible unit (but doesn't that contradict the fact that exactly one of the K states must be on)? ...
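In a softmax-visible RBM, no pairing of a hidden unit's 0/1 states with particular visible states is needed: each visible unit is a K-way one-hot group, every active hidden unit contributes its weights additively to all K state inputs (an inactive hidden unit contributes nothing), and the group's state is then sampled from a softmax over those inputs. A minimal NumPy sketch of the top-down sampling step, assuming a single softmax group with weights W of shape (n_hidden, K) and biases b of shape (K,):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_visible_softmax(h, W, b):
    """Sample one K-state softmax visible group given binary hidden vector h.

    A hidden unit that is on adds its weight row to all K state inputs;
    a hidden unit that is off contributes nothing (its row is scaled by 0).
    """
    scores = b + h @ W                          # shape (K,)
    scores = scores - scores.max()              # numerical stability
    p = np.exp(scores) / np.exp(scores).sum()
    k = rng.choice(len(p), p=p)                 # exactly one state is on
    return np.eye(len(p))[k], p

h = np.array([1.0, 0.0, 1.0])                   # binary hidden states
W = rng.normal(size=(3, 4))
b = np.zeros(4)
v, p = sample_visible_softmax(h, W, b)          # v is one-hot over K states
```

The bottom-up pass is unchanged from a binary RBM: the one-hot visible vector selects exactly one weight column per group when computing the hidden activations.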

I'm using Turi Create to generate a Core ML image classifier like this:

import turicreate as tc
data = tc.SFrame('data.sframe')
model = tc.image_classifier.create(data, target='label')
model.export_coreml('classifier.mlmodel')

However, the model gives me softmax confidence values, and in my application I need pre-softmax values (I'll calculate softmax in the app). Is there a way to turn off the softmax layer? I've had models like this supplied to me in the past, but now I need to generate my own. I've read through the docs and looked through the source ...
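If editing the exported model turns out to be impractical, note that the logits are recoverable from the softmax outputs up to a single additive constant per example, since log(softmax(x)) = x - logsumexp(x). Whether that constant matters depends on what the app does with the pre-softmax values. A NumPy check of the identity:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

logits = np.array([2.0, -1.0, 0.5])
p = softmax(logits)

# Taking the log of the softmax output returns the logits shifted by a
# per-example constant (the logsumexp of the original logits).
recovered = np.log(p)
shift = logits - recovered        # same value for every class
```

Because softmax is shift-invariant, applying softmax to the recovered values reproduces the model's confidences exactly.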

I have a neural network with a softmax at the end. Something like this:

def forward(self, x):
    x = self.conv(x)
    x = self.channel_transform_layer(x)
    output = self.softmax(x)
    return output

I would like the maximum value of the output to be p (p being between 0 and 1, say 0.7). I'm working on a task where an output greater than p does not make sense, so I want to constrain all the outputs to be between 0 and p. Taking a concrete example with PyTorch:

import torch
softmax = torch.nn.functional.softmax
softmax(torch.Tensor([1, 1, 5]))  # => tenso...
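One simple way to enforce the cap is to scale the whole softmax distribution by p, so every output lies in [0, p]; the trade-off is that the outputs then sum to p rather than 1. A NumPy sketch of this idea (the same arithmetic works on torch tensors):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def capped_softmax(x, p=0.7):
    """Scale the softmax by p: each entry lies in [0, p], entries sum to p."""
    return p * softmax(x)

x = np.array([1.0, 1.0, 5.0])
out = capped_softmax(x, p=0.7)   # no entry exceeds 0.7
```

If the outputs must also sum to 1 while each staying at most p, that is only feasible when p >= 1/K, and a different construction (e.g. projecting onto the constrained simplex) would be needed.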