Feature Request
Would it be possible to support Python scalars?
e.g.

```python
num_float = ep.astensor(1.0)
num_int = ep.astensor(42)
```
Why?
This would be helpful for implementing generic functions, for example:
```python
def get_kernel_size(sigma=1.0, trunc=4.0):
    sigma, trunc = ep.astensors(sigma, trunc)
    radius = (sigma * trunc + 0.5).astype(int)
    return (2 * radius + 1).raw
```
This could be called with default values as well as tensors.
```python
t_sigma = torch.abs(torch.randn(10))
n_trunc = np.abs(np.random.randn(10))

get_kernel_size(sigma=1.0, trunc=4.0)
get_kernel_size(sigma=t_sigma, trunc=4.0)
get_kernel_size(sigma=1.0, trunc=n_trunc)
```
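For illustration, here is a NumPy-only sketch of the desired behavior (this is not the eagerpy API, just an analogy): `np.asarray` already accepts Python scalars as well as arrays, which is exactly what this request asks of `ep.astensor`.

```python
import numpy as np


def get_kernel_size(sigma=1.0, trunc=4.0):
    # np.asarray converts Python scalars to 0-d arrays and
    # passes arrays through unchanged, so one code path
    # handles both scalar defaults and array arguments.
    sigma = np.asarray(sigma)
    trunc = np.asarray(trunc)
    radius = (sigma * trunc + 0.5).astype(int)
    return 2 * radius + 1


print(get_kernel_size())  # → 9
print(get_kernel_size(sigma=np.abs(np.random.randn(10))))  # array of 10 sizes
```

If `ep.astensor`/`ep.astensors` handled scalars the same way, the function in this issue would work unchanged for scalar defaults, PyTorch tensors, and NumPy arrays alike.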