- ott.tools.soft_sort.quantize(inputs, num_levels=10, axis=-1, **kwargs)
Soft quantizes an input using `num_levels` values along `axis`.

Quantization concentrates several input values around a few predefined levels, here `num_levels` of them. The soft quantization operator proposed here does so by carrying out a soft concentration that is differentiable. The `inputs` values are first soft-sorted, resulting in `num_levels` values. In a second step, each of the `inputs` values is then assigned a composite value equal to a convex combination of those soft-sorted values, using the transportation matrix. As the regularization parameter `epsilon` of regularized optimal transport goes to 0, this operator recovers the expected behavior of quantization, namely each value in `inputs` is assigned a single level. When using `epsilon > 0`, the behaviour is similar but differentiable.
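The two-step mechanism above can be illustrated with a minimal NumPy sketch. This is not the OTT implementation: it replaces the regularized optimal transport plan with a simple softmax over negative squared distances to fixed levels, and the `soft_quantize_sketch` function and its `levels` argument are hypothetical stand-ins. It only shows how a convex combination of level values recovers hard quantization as `epsilon` goes to 0.

```python
import numpy as np

def soft_quantize_sketch(x, levels, epsilon):
    """Illustrative sketch of soft quantization (not the OTT algorithm).

    Builds a soft assignment of each input to each level via a softmax
    over negative squared distances (a stand-in for the entropic
    transport matrix), then replaces each input by the corresponding
    convex combination of level values.
    """
    d = (x[:, None] - levels[None, :]) ** 2   # squared Euclidean cost
    w = np.exp(-d / epsilon)
    w /= w.sum(axis=1, keepdims=True)         # each row sums to 1
    return w @ levels                         # convex combination of levels

x = np.array([0.05, 0.1, 0.48, 0.52, 0.9])
levels = np.array([0.1, 0.5, 0.9])

# Tiny epsilon: each output collapses to its nearest level (hard quantization).
hard = soft_quantize_sketch(x, levels, epsilon=1e-4)
# Larger epsilon: outputs are smooth blends of levels, hence differentiable.
soft = soft_quantize_sketch(x, levels, epsilon=0.5)
```

With `epsilon=1e-4`, `hard` is (up to floating point) `[0.1, 0.1, 0.5, 0.5, 0.9]`, while `soft` stays strictly inside the range of the levels and varies smoothly with `x`.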
- Parameters
  - inputs (jnp.ndarray) – the inputs as a jnp.ndarray[batch, dim].
  - num_levels (int) – number of levels available to quantize the signal.
  - axis (int) – axis along which quantization is carried out.
  - kwargs (Any) – keyword arguments passed on to lower-level functions. Of interest to the user are `squashing_fun`, which redistributes the values in `inputs` to lie in [0, 1] (sigmoid of whitened values by default) in order to solve the optimal transport problem; `cost_fn`, used in `PointCloud`, which defines the ground cost function to transport from `inputs` to the `num_targets` target values (squared Euclidean distance by default, see `pointcloud.py` for more details); and `epsilon` values, as well as other parameters shaping the regularized optimal transport solver.
- Return type
  jnp.ndarray
- Returns
  A jnp.ndarray of the same size as `inputs`.