Apple's CSAM system has been tricked, but the company has two safeguards

Update: Apple has mentioned a second, server-side check, and a computer-vision company has outlined one possibility of what this could be, described in "How the second check might work" below.
After developers reverse-engineered parts of it, an early version of Apple's CSAM system has effectively been tricked into flagging an innocent photo. Apple, however, says it has additional protections to prevent this from happening in real life.
The latest development came after the NeuralHash algorithm was published to the open-source site GitHub, where anyone can experiment with it…
All CSAM systems work by importing a database of known child sexual abuse material from organizations such as the National Center for Missing and Exploited Children (NCMEC). That database is supplied as hashes, or digital fingerprints, derived from the images.
While most tech giants scan photos uploaded to the cloud, Apple uses the NeuralHash algorithm on the customer's iPhone to generate a hash of each stored photo, then compares it against a downloaded copy of the CSAM hash database.
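To make that flow concrete, here is a minimal Python sketch of this kind of on-device hash matching. Everything in it is illustrative: the real NeuralHash model and database format are not reproduced, the function and variable names are assumptions, and Apple's actual design compares hashes through a cryptographic private set intersection protocol rather than a plain lookup.

```python
# Illustrative sketch only: a generic perceptual-hash lookup, not Apple's
# actual NeuralHash pipeline (which uses blinded hashes and private set
# intersection so the device never learns the result of a match).
from typing import Callable, Set

def scan_photo(photo_bytes: bytes,
               perceptual_hash: Callable[[bytes], str],
               known_csam_hashes: Set[str]) -> bool:
    """Return True if the photo's fingerprint matches a known CSAM hash."""
    digest = perceptual_hash(photo_bytes)   # NeuralHash-style fingerprint
    return digest in known_csam_hashes      # compare against the downloaded database
```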
Yesterday, a developer claimed to have reverse-engineered Apple's algorithm and released the code on GitHub, a claim Apple effectively confirmed.
Within hours of the GitHub release, researchers had successfully used the algorithm to create a deliberate false positive: two completely different images that produced the same hash value. This is known as a collision.
With such systems, there is always a risk of collisions, since the hash is of course a greatly simplified representation of the image, but it is surprising that someone was able to generate one so quickly.
The deliberate collision here is just a proof of concept. Developers have no access to the CSAM hash database, which would be required to create false positives in the live system, but it does prove that collision attacks are relatively easy in principle.
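For illustration, a collision in this setting is simply two distinct inputs that hash identically. The sketch below uses a hypothetical `perceptual_hash` stand-in for NeuralHash:

```python
def is_collision(img_a: bytes, img_b: bytes, perceptual_hash) -> bool:
    """Two different images that produce the same hash constitute a collision."""
    return img_a != img_b and perceptual_hash(img_a) == perceptual_hash(img_b)
```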
Apple effectively confirmed that the algorithm is the basis of its own system, but told Motherboard that it is not the final version. The company also said it had never intended to keep it secret.
Apple told Motherboard in an email that the version analyzed by the user on GitHub is a generic version, not the final version that will be used for iCloud Photos CSAM detection. Apple said it had also disclosed the algorithm.
"NeuralHash algorithm [...] wani ɓangare ne na lambar tsarin aiki da aka sanya hannu [da] masu bincike na tsaro na iya tabbatar da cewa halinsa ya dace da bayanin," in ji wata takarda ta Apple.
The company went on to say that there are two further steps: running a secondary (secret) matching system on its own servers, and a manual review.
Apple also stated that after a user passes the 30-match threshold, a second, non-public algorithm running on Apple's servers will check the results.
"An zaɓi wannan hash mai zaman kanta don ƙin yiwuwar cewa kuskuren NeuralHash ya dace da bayanan CSAM da aka ɓoye akan na'urar saboda tsangwama na hotunan da ba CSAM ba kuma ya wuce matakin da ya dace."
Brad Dwyer of Roboflow found a way to easily distinguish between the two images posted as a proof of concept for the collision attack.
I was curious how these images look to CLIP, OpenAI's similar-but-different neural feature extractor. CLIP works similarly to NeuralHash; it takes an image and uses a neural network to produce a set of feature vectors that map to the image's contents.
But OpenAI's network is different. It is a general-purpose model that can map between images and text. That means we can use it to extract human-understandable information about an image.
I ran the two colliding images above through CLIP to see whether it was fooled as well. The short answer is: no. This means Apple should be able to apply a second feature-extraction network such as CLIP to flagged CSAM images to determine whether they are real or fake. It is much harder to generate an image that fools both networks at once.
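Here is a rough sketch of the kind of cross-check Dwyer describes, using OpenAI's open-source CLIP package (installable from github.com/openai/CLIP). The file names are placeholders for the two published collision images, and any similarity threshold for calling the pair "different" would be a policy choice, not something Apple has specified.

```python
import torch
import clip                      # pip install git+https://github.com/openai/CLIP.git
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

def clip_embedding(path: str) -> torch.Tensor:
    """Return a unit-normalized CLIP image embedding."""
    image = preprocess(Image.open(path)).unsqueeze(0).to(device)
    with torch.no_grad():
        features = model.encode_image(image)
    return features / features.norm(dim=-1, keepdim=True)

# Cosine similarity between the two colliding images: a low score means CLIP
# "sees" them as different even though NeuralHash assigned them the same hash.
similarity = (clip_embedding("collision_a.png") @ clip_embedding("collision_b.png").T).item()
print(f"CLIP cosine similarity: {similarity:.3f}")
```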
Finally, as mentioned earlier, the images are reviewed manually to confirm that they are CSAM.
A security researcher says the main risk is that anyone wanting to annoy Apple could feed false positives to its human reviewers.
"A zahiri Apple ya tsara wannan tsarin, don haka aikin hash ɗin ba ya buƙatar a ɓoye shi, saboda kawai abin da za ku iya yi tare da 'wanda ba CSAM ba a matsayin CSAM' shine ku fusatar da ƙungiyar amsawar Apple tare da wasu hotuna masu lalata har sai sun aiwatar da tacewa don kawar da su. Bincike Wadancan dattin da ke cikin bututun bututun na karya ne,” Nicholas Weaver, babban mai bincike a Cibiyar Kimiyyar Kwamfuta ta kasa da kasa a Jami’ar California, Berkeley, ya shaida wa Motherboard a wata hira ta yanar gizo.
Privacy is an issue of increasing concern in today's world. Follow all of our coverage of privacy, security, and more in our guides.
Ben Lovejoy is a British technology writer and EU editor for 9to5Mac. He is known for his op-eds and diary pieces, exploring his experience with Apple products over time to produce more rounded reviews. He also writes fiction, with two technothrillers, a few science-fiction shorts, and a rom-com!


Post time: Aug-20-2021