Evaluate User Comment Toxicity Using The Analyze Toxicity With Perspective API Firebase Extension

Online toxicity can be hard to manage inside your apps, but it's not impossible. The Perspective API uses machine learning to identify comments that readers are likely to perceive as toxic. With the Analyze Toxicity with Perspective API Firebase Extension, you can determine whether user comments are safe or harmful.
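
The data flow is simple: the app writes a comment document to a Firestore collection the extension watches, the extension sends the text to the Perspective API, and the scores are written back onto that same document. Here is a minimal Dart sketch of the write side, assuming the extension is configured to watch a 'comments' collection and analyze its 'text' field (both names are placeholders for whatever you configure):

```dart
import 'package:cloud_firestore/cloud_firestore.dart';

/// Writes a new comment document. The extension watches the collection,
/// sends the 'text' field to the Perspective API, and writes the toxicity
/// scores back onto the same document.
Future<DocumentReference> postComment(String userId, String text) {
  return FirebaseFirestore.instance.collection('comments').add({
    'userId': userId,
    'text': text,
    'createdAt': FieldValue.serverTimestamp(),
  });
}
```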

Let me know in the comments what you think about this extension.

CHAPTERS
0:15 - Intro
1:22 - Get Access To Perspective API
1:44 - Enabling The API
3:08 - Install The Firebase Extension
5:39 - Create Data In Firestore
5:59 - Build The Flutter App
9:49 - Outro
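
For the "Create Data In Firestore" and "Build The Flutter App" chapters, the app only needs to read back the scores the extension writes. Below is a minimal sketch of a stream that hides comments scored above a toxicity threshold, assuming the extension stores its results in an 'attribute_scores' map with a 'TOXICITY' entry (check your installed extension's configuration for the actual output field):

```dart
import 'package:cloud_firestore/cloud_firestore.dart';

/// Streams comment text, dropping anything the extension has scored
/// above [threshold] for TOXICITY. Comments that have not been
/// analyzed yet (no scores on the document) are passed through.
Stream<List<String>> safeComments({double threshold = 0.7}) {
  return FirebaseFirestore.instance
      .collection('comments')
      .orderBy('createdAt', descending: true)
      .snapshots()
      .map((snapshot) => snapshot.docs.where((doc) {
            final scores = doc.data()['attribute_scores'] as Map?;
            final toxicity = (scores?['TOXICITY'] as num?)?.toDouble();
            return toxicity == null || toxicity < threshold;
          }).map((doc) => doc.data()['text'] as String).toList());
}
```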

Source Code For This Example Here: 👇

You Can View The Extension Here: 👇

Learn More About The Perspective API: 👇

Learn How To Create A Text Translation App In 15 Minutes Using The Translate Text Firebase Extension: 👇

View Apps That I Have Developed In Flutter Here: 👇

Connect With Me:

Thank you for watching!

#treycodes
#flutter
#firebase
Comments

My guy never misses a video, no matter what. Really interesting video.

adnanhabib

Might have to add this in. What kind of bills does this generate? More than a penny?

ExtraServingsBTS

How can you make it check for profanity before it gets stored in the DB?

Daniel-wmpk