
# Toxic comment classifier

The dataset used for training and evaluation is available on Kaggle: Toxic Comment Classification Challenge. The model classifies comments from the dataset as toxic or non-toxic, and the Gradio library in Python is used to build a web interface for interacting with the model.
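The README does not include the training code, so the following is only an illustrative sketch of the described setup: a simple text classifier (here a hypothetical TF-IDF + logistic regression pipeline, standing in for whatever model the project actually uses) with a helper that wraps it in a Gradio interface. The sample texts and the `classify`/`launch_demo` names are assumptions, not the project's actual code.

```python
# Sketch only: a toy toxicity classifier plus a Gradio wrapper.
# The real project trains on the Kaggle Toxic Comment Classification
# Challenge CSV; the tiny inline dataset below just makes this runnable.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative stand-in for the Kaggle training data.
texts = [
    "you are awful and stupid",
    "have a great day",
    "I hate you so much",
    "thanks for the helpful answer",
]
labels = [1, 0, 1, 0]  # 1 = toxic, 0 = non-toxic

# TF-IDF features fed into a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

def classify(comment: str) -> dict:
    """Return class probabilities for a single comment."""
    proba = model.predict_proba([comment])[0]
    return {"non-toxic": float(proba[0]), "toxic": float(proba[1])}

def launch_demo():
    # Requires `pip install gradio`; starts a local web UI where a user
    # types a comment and sees the predicted label probabilities.
    import gradio as gr
    gr.Interface(fn=classify, inputs="text", outputs="label").launch()
```

Calling `launch_demo()` opens the Gradio interface in the browser; `classify` can also be used directly for batch scoring.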