In this segment, you’ll make your moderation solution more robust by adding human-in-the-loop functionality.
Add Human-in-the-loop
To implement human-in-the-loop functionality, you'll first update the analyze_text and analyze_image functions to handle cases where classifying content as simply safe or unsafe for a particular category may not always be correct.
Head to starter/text_analysis.py and replace the conditions below # Check for inappropriate content with the following:
violations = {}
if hate_result and hate_result.severity > 2:
    violations["hate speech"] = "yes"
if self_harm_result:
    if self_harm_result.severity > 4:
        violations["self-harm"] = "yes"
    elif self_harm_result.severity > 3:
        violations["self-harm"] = "likely"
if sexual_result:
    if sexual_result.severity > 2:
        violations["sexual"] = "yes"
    elif sexual_result.severity > 1:
        violations["sexual"] = "likely"
if violence_result and violence_result.severity > 2:
    violations["violent references"] = "yes"
In this updated code, you introduced a few more checks to capture the scenarios where the assessment of a content category could be "likely" but not certain. For example, the self-harm check now has a new condition for when the severity level falls between 3 and 4. In that case, there's a possibility that the post content may be harmful and shouldn't be allowed to be published, but the verdict isn't conclusive. The same is the case with the sexual check.
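For reference, the threshold logic above can be factored into a small standalone helper. This is only a sketch: `CategoryResult` here is a stand-in for the per-category severity objects returned by the Azure AI Content Safety SDK, and `build_violations` is a hypothetical name, not part of the course code:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class CategoryResult:
    # Stand-in for an Azure Content Safety category analysis result;
    # the real SDK object also exposes a `severity` attribute.
    severity: int


def build_violations(hate_result: Optional[CategoryResult],
                     self_harm_result: Optional[CategoryResult],
                     sexual_result: Optional[CategoryResult],
                     violence_result: Optional[CategoryResult]) -> dict:
    """Map per-category severity scores to 'yes'/'likely' violation flags."""
    violations = {}
    if hate_result and hate_result.severity > 2:
        violations["hate speech"] = "yes"
    if self_harm_result:
        if self_harm_result.severity > 4:
            violations["self-harm"] = "yes"
        elif self_harm_result.severity > 3:
            violations["self-harm"] = "likely"
    if sexual_result:
        if sexual_result.severity > 2:
            violations["sexual"] = "yes"
        elif sexual_result.severity > 1:
            violations["sexual"] = "likely"
    if violence_result and violence_result.severity > 2:
        violations["violent references"] = "yes"
    return violations
```

For example, a self-harm severity of 4 clears the "likely" threshold but not the "yes" threshold, so `build_violations(None, CategoryResult(4), None, None)` yields `{"self-harm": "likely"}`.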
Next, head back to starter/business_logic.py and update the code below the if statement that checks whether any form of violation was found in the moderation results of text and image analysis:
# 1
text_violation_flags = text_analysis_result.values()
image_violation_flags = image_analysis_result.values()
# 2
if "likely" in text_violation_flags or "likely" in image_violation_flags:
    return {'status': "re-evaluation needed"}
# 3
status_detail = 'Your post contains references that violate our community guidelines.'
if text_analysis_result:
    status_detail = status_detail + '\n' + f"Violation found in text: {', '.join(text_analysis_result)}"
if image_analysis_result:
    status_detail = status_detail + '\n' + f"Violation found in image: {', '.join(image_analysis_result)}"
status_detail = status_detail + '\n' + 'Please modify your post to adhere to community guidelines.'
# 4
return {'status': "violations found", 'details': status_detail}
Here, you check if text_violation_flags or image_violation_flags contains the "likely" value. If so, the moderation system knows that a human evaluator should also review the content. Hence, the function returns the "re-evaluation needed" response, instead of approving or rejecting the publishing request. Typically, you'll write additional logic to find and notify a human reviewer to look at and evaluate the content. Meanwhile, the user is asked to wait while the content is evaluated by humans.
The rest of the code handles confirmed violations. You define a new variable status_detail and append the flagged categories to the string in a human-readable format, so that the user can be informed and requested to update the post to adhere to community guidelines.
Finally, you modify the return value of the method so that the user can be informed about the violations found in the content and asked to update the post to address the flagged concerns.
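Taken together, the decision flow described above can be sketched as a single function. `moderate_post` and the `"success"` status name are assumptions for illustration only; in the course project this logic lives inside starter/business_logic.py and receives the real analysis results:

```python
def moderate_post(text_analysis_result: dict, image_analysis_result: dict) -> dict:
    """Decide whether a post is published, rejected, or sent for human review."""
    # No violations found in either text or image: publish straight away.
    if not text_analysis_result and not image_analysis_result:
        return {'status': "success"}

    # Any 'likely' flag means the automated verdict is uncertain,
    # so defer to a human evaluator instead of rejecting outright.
    if ("likely" in text_analysis_result.values()
            or "likely" in image_analysis_result.values()):
        return {'status': "re-evaluation needed"}

    # Otherwise, build a human-readable rejection message listing
    # each category that was flagged.
    status_detail = 'Your post contains references that violate our community guidelines.'
    if text_analysis_result:
        status_detail += '\n' + f"Violation found in text: {', '.join(text_analysis_result)}"
    if image_analysis_result:
        status_detail += '\n' + f"Violation found in image: {', '.join(image_analysis_result)}"
    status_detail += '\n' + 'Please modify your post to adhere to community guidelines.'
    return {'status': "violations found", 'details': status_detail}
```

For instance, `moderate_post({"self-harm": "likely"}, {})` returns the "re-evaluation needed" status, while a post with only "yes" flags returns "violations found" along with the detailed message.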
Re-run the App to Test Moderation System
Open the terminal in VS Code and re-run the web app using the following command:
This time, you get the following message: "Post Upload Failed". At the bottom of the message, you'll also find an option to submit an appeal request. You can raise an appeal request if you're confident that the post adheres to the community guidelines and should be allowed to be published without any updates.
In scenarios like these, it's always a good call to provide an Appeal option in the platform for users who think their content got rejected or flagged by mistake and should be allowed. Such requests should then be sent to the human evaluators, who will evaluate and take the final call: either enable the content, or provide a response back to the user explaining why the content isn't a good fit for publication.
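One common way to wire this up is a shared review queue that both automatic "re-evaluation needed" results and user appeals feed into, which human evaluators then drain in order. The class and field names below are hypothetical, not part of the course materials; this is a minimal sketch of the idea:

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class ReviewItem:
    post_id: str
    reason: str   # e.g. "re-evaluation needed" or "user appeal"
    notes: str = ""


class HumanReviewQueue:
    """FIFO queue of posts awaiting a human moderator's verdict."""

    def __init__(self):
        self._items = deque()

    def submit(self, item: ReviewItem) -> None:
        # Both auto-flagged posts and user appeals land here.
        self._items.append(item)

    def next_item(self):
        # Returns the oldest pending item, or None when the queue is empty.
        return self._items.popleft() if self._items else None
```

A reviewer-facing tool would call `next_item()` repeatedly, record a verdict for each post, and either publish it or send the explanatory response back to the user.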
That's it for this demo. Please continue watching the next segment to complete the lesson!
This content was released on Nov 15 2024. The official support period is 6 months from this date.
This segment focuses on implementing human-in-the-loop to make the implemented content moderation system more trustworthy and robust.