Microsoft has released a new tool for identifying child predators who groom children for abuse in online chats. Project Artemis, based on a technique Microsoft has been using on the Xbox, will now be made available to other online companies with chat functions. It comes at a time when multiple platforms are dealing with child predators targeting kids for sexual abuse by striking up conversations in chat windows.

Artemis works by recognizing specific words and speech patterns and flagging suspicious messages for review by a human moderator. The moderator then determines whether to escalate the situation by contacting police or other law enforcement officials. If a moderator finds a request for child sexual exploitation or images of child abuse, the National Center for Missing & Exploited Children will be notified for further action.
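Microsoft hasn’t published Artemis’ internals, but the flow described above (pattern recognition over chat text, a score, and escalation to a human reviewer) maps onto a familiar moderation pipeline. The sketch below is a minimal, hypothetical illustration of that flow, not Microsoft’s implementation; the patterns, weights, and review threshold are all invented for the example.

```python
import re
from dataclasses import dataclass

# Hypothetical illustration of a grooming-detection pipeline like the one
# described above: scan chat text for risky phrases, assign a score, and
# queue anything over a threshold for human review. The patterns and
# threshold here are invented examples, not Microsoft's actual rules.
RISK_PATTERNS = {
    re.compile(r"\bhow old are you\b", re.I): 1,
    re.compile(r"\bdon'?t tell (your )?(mom|dad|parents)\b", re.I): 3,
    re.compile(r"\bsend (me )?a (photo|pic|video)\b", re.I): 3,
    re.compile(r"\bour (little )?secret\b", re.I): 2,
}
REVIEW_THRESHOLD = 3  # invented cutoff; a real system would tune this

@dataclass
class Flag:
    message: str
    score: int

def score_message(text: str) -> int:
    """Sum the weights of every risky pattern found in one message."""
    return sum(w for p, w in RISK_PATTERNS.items() if p.search(text))

def scan_conversation(messages: list[str]) -> list[Flag]:
    """Return the messages a human moderator should review."""
    flags = []
    for msg in messages:
        score = score_message(msg)
        if score >= REVIEW_THRESHOLD:
            flags.append(Flag(msg, score))  # escalate to human review
    return flags

if __name__ == "__main__":
    chat = ["hey, how old are you?", "this is our secret, don't tell your mom"]
    for flag in scan_conversation(chat):
        print(f"flagged for review (score {flag.score}): {flag.message!r}")
```

The point of the human-in-the-loop step is that keyword matching alone can’t capture conversational nuance; a reviewer, not the classifier, decides whether to involve law enforcement.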

“There is abuse on every platform.”

“Sometimes we yell at the platforms — and there is abuse on every platform that has online chat — but we should applaud them for putting mechanisms in place,” says Julie Cordua, CEO of nonprofit tech organization Thorn, which works to prevent the online sexual abuse of children. “If someone says, ‘Oh, we don’t have abuse,’ I’ll say to them, ‘Well, are you looking?’”

In December, The New York Times found that online chat platforms were fertile “hunting grounds” for child predators who groom their victims by first befriending them and then insinuating themselves into a child’s life, both online and off. Most major platforms are dealing with some measure of abuse by child predators, including Microsoft’s Xbox Live. In 2017, as the Times noted, a man was sentenced to 15 years in prison for threatening children with rape and murder over Xbox Live chat.

Detection of online child sexual abuse and policies for handling it can vary greatly from company to company, with many of the companies involved wary of potential privacy violations, the Times reported. In 2018, Facebook announced a system to catch predators that looks at whether someone quickly contacts many children and how often they’re blocked. But Facebook also has access to much more data about its users than other platforms might.
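As reported, the Facebook system hinges on behavioral signals rather than message content: how quickly an account contacts many minors, and how often other users block it. A minimal sketch of that kind of heuristic, assuming invented field names and cutoffs, might look like this:

```python
from dataclasses import dataclass

# Hypothetical sketch of the behavioral signals the Times attributed to
# Facebook's 2018 system: rapid contact of many children plus a high block
# rate. Field names and thresholds are invented for illustration only.
@dataclass
class AccountActivity:
    minors_contacted_last_24h: int  # distinct minors messaged in a day
    times_blocked: int              # how often other users blocked this account
    total_contacts: int             # all distinct users contacted

def looks_predatory(a: AccountActivity) -> bool:
    """Flag accounts that mass-contact minors or get blocked unusually often."""
    rapid_contact = a.minors_contacted_last_24h >= 10  # invented cutoff
    block_rate = a.times_blocked / max(a.total_contacts, 1)
    return rapid_contact or block_rate > 0.2           # invented cutoff

print(looks_predatory(AccountActivity(12, 3, 40)))  # True: mass-contacts minors
```

Signals like these depend on knowing which accounts belong to children, which is why Facebook’s broader access to user data matters here.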

Child predators lurk in online chats to find vulnerable kids

Microsoft’s tool is important, according to Thorn, because it’s available to any company using chat and helps set an industry standard for what the detection and monitoring of predators should look like, which should aid the development of future prevention tools. Chats are difficult to monitor for potential child abuse because there can be so much nuance in a conversation, Cordua says.

Child predators can lurk in online chat rooms to find victims much like they would offline, but with much more immediate access, says Elizabeth Jeglic, a professor of psychology at John Jay College of Criminal Justice in New York who has written extensively about protecting children from online sexual abuse, in particular the often subtle practice of grooming. “Within 30 minutes they may be talking sexually with a child,” she says. “In person it’s harder to get access to a child, but online a predator is able to go in, test the waters, and if it doesn’t work, go ahead and move on to the next victim.”

It doesn’t stop with one platform, Cordua adds. “They’ll try to isolate the child and will follow them across multiple platforms, so they can have multiple exploitation points,” she says. A predator may ask a child for a photo, then ratchet up the demands to videos, increasing the level of sexual content. “The child is racked with guilt and fear, and this is why the predator goes across platforms: he can say, ‘Oh, I know all your friends on Facebook. If you don’t send me a video, I’ll send that first photo to everyone at your junior high.’”

“We need to get better at identifying where this is happening.”

Artemis has been in development for more than 14 months, Microsoft says, beginning in November 2018 at a Microsoft “360 Cross-Industry Hackathon,” which was co-sponsored by two children’s protection groups, the WePROTECT Global Alliance and the Child Dignity Alliance. A team including Roblox, Kik, Thorn, and The Meet Group worked with Microsoft on the project. It was led by Hany Farid, who developed the PhotoDNA tool for detecting and reporting images of child sexual exploitation online.

Some of the details about how the Artemis tool will work in practice remain unclear, however, and are likely to vary depending on which platform is using it. Microsoft hasn’t stated whether Artemis will work with chat programs that use end-to-end encryption, or what steps will be taken to prevent potential PTSD among moderators.

Thorn will administer the program and handle licensing and support to get participating companies onboarded, Microsoft says.

Cordua says that while Artemis has some initial limitations — it currently only works in English — the tool is a huge step in the right direction. Since each company that uses the tool can customize it for its own audience (chats on gaming platforms will obviously differ from those on social apps), there will be ample opportunity to adapt and refine how it works. And, she says, it’s about time platforms moved away from the failed practice of self-policing and toward proactive prevention of child grooming and abuse.

In its blog post, Microsoft adds that the Artemis tool is “by no means a panacea” but is a first step toward detecting the online grooming of children by sexual predators, a problem it describes as “weighty.”

“The first step is we need to get better at identifying where this is happening,” Cordua says. “But all companies that host any chat or video should be doing this, or they are complicit in allowing the abuse of children on their platforms.”
