K3 Addons supercharge your multibackend Keras 3 workflow, giving you access to a range of innovative machine learning techniques. While Keras 3 offers a rich set of APIs, not everything can be included in core because some components are too specialized for general use. K3 Addons bridges this gap, ensuring you're not limited by the core Keras 3 library. These add-ons include attention mechanisms for text and image data, advanced optimizers, and specialized layers tailored to unique data types. With K3 Addons, you gain the flexibility to tackle emerging ML challenges and push the boundaries of what's possible with Keras 3.
To install K3 Addons, simply run the following command in your environment:

```shell
pip install k3-addons
```

K3 Addons currently includes layers, losses, and activations APIs.
## Layers

### Pooling
- `k3_addons.layers.AdaptiveAveragePooling1D`: Multibackend implementation of `torch.nn.AdaptiveAvgPool1d`. The results are close to PyTorch.
- `k3_addons.layers.AdaptiveMaxPooling1D`: Multibackend implementation of `torch.nn.AdaptiveMaxPool1d`. The results are close to PyTorch.
- `k3_addons.layers.AdaptiveAveragePooling2D`: Multibackend implementation of `torch.nn.AdaptiveAvgPool2d`. The results are close to PyTorch.
- `k3_addons.layers.AdaptiveMaxPooling2D`: Multibackend implementation of `torch.nn.AdaptiveMaxPool2d`. The results are close to PyTorch.
- `k3_addons.layers.Maxout`: Multibackend port of `tensorflow_addons.layers.Maxout`. Paper
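Unlike ordinary pooling with a fixed kernel, adaptive pooling infers the bin boundaries from the requested output size. A minimal NumPy sketch of the idea behind the 1D variant (the function name is hypothetical and this is not the library's implementation; it mirrors the floor/ceil bin edges used by `torch.nn.AdaptiveAvgPool1d`):

```python
import numpy as np

def adaptive_avg_pool_1d(x, output_size):
    """Average-pool a 1D array into `output_size` bins.

    Bin i covers indices [floor(i*n/L), ceil((i+1)*n/L)),
    the same edge rule torch.nn.AdaptiveAvgPool1d uses.
    """
    n = len(x)
    out = np.empty(output_size)
    for i in range(output_size):
        start = (i * n) // output_size            # floor(i * n / L)
        end = -((-(i + 1) * n) // output_size)    # ceil((i + 1) * n / L)
        out[i] = x[start:end].mean()
    return out

print(adaptive_avg_pool_1d(np.array([1., 2., 3., 4., 5., 6.]), 3))
# each bin averages two elements -> [1.5 3.5 5.5]
```

Because the bin edges adapt to the input length, the same layer can accept variable-length inputs and still emit a fixed-size output.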
### Normalization
- `k3_addons.layers.InstanceNormalization`: A specific case of `keras.layers.GroupNormalization`, since it normalizes all features of one channel; the group size is equal to the channel size.
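To see why instance normalization is group normalization with one group per channel, here is a NumPy sketch of the underlying math (the function name is hypothetical, and the learnable scale and offset the real layer carries are omitted for brevity):

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    """Normalize each channel of each sample over its spatial axis.

    x: (batch, length, channels), channels-last. Every channel is
    standardized independently per sample, i.e. group norm where
    the number of groups equals the number of channels.
    """
    mean = x.mean(axis=1, keepdims=True)
    var = x.var(axis=1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)
```

After the transform, each (sample, channel) slice has approximately zero mean and unit variance, regardless of the statistics of the other samples in the batch.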
### Attention
- `k3_addons.layers.DoubleAttention`
- `k3_addons.layers.AFTFull`
- `k3_addons.layers.ChannelAttention2D`
- `k3_addons.layers.SpatialAttention2D`
- `k3_addons.layers.ECAAttention`: ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks
- `k3_addons.layers.ExternalAttention`: Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks
- `k3_addons.layers.ResidualAttention`: Residual Attention: A Simple but Effective Method for Multi-Label Recognition
- `k3_addons.layers.MobileViTAttention`: Coordinate Attention for Efficient Mobile Network Design
- `k3_addons.layers.BAMBlock`
- `k3_addons.layers.CBAM`
- `k3_addons.layers.MobileViTv2Attention`: Separable Self-attention for Mobile Vision Transformers
- `k3_addons.layers.ParNetAttention`
- `k3_addons.layers.SimAM`
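To illustrate how the channel-attention blocks in this list operate, here is a simplified NumPy sketch of the ECA-Net mechanism: a global average pool squeezes spatial information into one value per channel, a small 1D convolution across channels produces per-channel gates, and the input is rescaled. This is an illustration only (with a fixed averaging kernel standing in for the learned one), not the `k3_addons` implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def eca_attention(x, kernel_size=3):
    """Efficient Channel Attention (ECA-Net) sketch.

    x: (height, width, channels), a single sample.
    Squeeze -> 1D conv over the channel axis -> sigmoid gate -> rescale.
    """
    c = x.shape[-1]
    squeeze = x.mean(axis=(0, 1))                  # global average pool -> (c,)
    pad = kernel_size // 2
    padded = np.pad(squeeze, pad)
    kernel = np.ones(kernel_size) / kernel_size    # stand-in for the learned kernel
    conv = np.array([padded[i:i + kernel_size] @ kernel for i in range(c)])
    gate = sigmoid(conv)                           # per-channel gate in (0, 1)
    return x * gate                                # rescale each channel
```

Because the gate is a sigmoid, every channel is scaled by a factor strictly between 0 and 1; the 1D convolution lets each channel's gate depend only on its few neighbours, which is what keeps ECA cheap compared with a full squeeze-and-excitation bottleneck.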
## Losses
- `k3_addons.losses.ContrastiveLoss`
- `k3_addons.losses.GIoULoss`
- `k3_addons.losses.PinballLoss`
- `k3_addons.losses.SigmoidFocalCrossEntropy`
- `k3_addons.losses.WeightedKappaLoss`
- `k3_addons.losses.pairwise_distance`
- `k3_addons.losses.pinball_loss`
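As one example from this list, `SigmoidFocalCrossEntropy` implements the focal loss from the RetinaNet paper, which down-weights well-classified examples so training focuses on hard ones. A NumPy sketch of the formula (the function name and per-element reduction here are assumptions for illustration, not the library's signature):

```python
import numpy as np

def sigmoid_focal_loss(y_true, logits, alpha=0.25, gamma=2.0):
    """Sigmoid focal cross-entropy (Lin et al., RetinaNet).

    FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t),
    where p_t is the predicted probability of the true class.
    Returns the per-element loss.
    """
    p = 1.0 / (1.0 + np.exp(-logits))
    p_t = np.where(y_true == 1, p, 1 - p)
    alpha_t = np.where(y_true == 1, alpha, 1 - alpha)
    ce = -np.log(p_t)                       # ordinary cross-entropy
    return alpha_t * (1 - p_t) ** gamma * ce
```

With `gamma = 0` and `alpha = 0.5` this reduces to (half of) plain binary cross-entropy; raising `gamma` shrinks the loss on confident, correct predictions much faster than on misclassified ones.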
## Activations
- `k3_addons.activations.hardshrink`
- `k3_addons.activations.lisht`
- `k3_addons.activations.mish`
- `k3_addons.activations.snake`
- `k3_addons.activations.tanhshrink`
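Several of these activations are simple closed-form functions. NumPy sketches of three of them, for reference (illustrative only, not the library code):

```python
import numpy as np

def mish(x):
    """mish(x) = x * tanh(softplus(x)), a smooth self-gated activation."""
    return x * np.tanh(np.log1p(np.exp(x)))

def tanhshrink(x):
    """tanhshrink(x) = x - tanh(x)."""
    return x - np.tanh(x)

def hardshrink(x, lambd=0.5):
    """Zero out values with magnitude <= lambda, pass the rest through."""
    return np.where(np.abs(x) > lambd, x, 0.0)
```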
