# Toxicity Detection Model

Unfortunately, toxicity is a common occurrence in online activity. This model is designed to counteract its negative effects by detecting toxic text and assigning it a score. It is built with TensorFlow and Keras.
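
As an illustrative sketch only (not the repository's actual implementation), a minimal Keras text classifier that assigns a toxicity score could look like the following. The example data, vocabulary size, and layer sizes are placeholder assumptions.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical example data: comments and binary toxicity labels (1 = toxic, 0 = non-toxic).
texts = np.array(["you are awful", "have a great day", "I hate you", "thanks for the help"])
labels = np.array([1, 0, 1, 0])

# Turn raw strings into padded integer token sequences.
vectorizer = layers.TextVectorization(max_tokens=20_000, output_sequence_length=100)
vectorizer.adapt(texts)
x = vectorizer(texts)

# Small classifier: embedding -> average pooling -> dense -> sigmoid score in [0, 1].
model = keras.Sequential([
    layers.Embedding(input_dim=20_000, output_dim=64),
    layers.GlobalAveragePooling1D(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, labels, epochs=10, verbose=0)

# Assign a toxicity score to a new comment; values closer to 1.0 indicate more toxic text.
score = model.predict(vectorizer(np.array(["this is a terrible idea"])))[0, 0]
print(f"toxicity score: {score:.3f}")
```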