Facebook announced today that it’s updating its group privacy settings and working to better moderate content that breaks the platform’s rules. The platform is renaming its confusing public, closed, and secret group settings to the slightly more straightforward public and private settings, with the option to make private groups visible or hidden to non-members. The new settings will also provide more control for admins and members, giving admins more moderation tools and members the option to see a group’s history and preview its content before accepting or declining an invitation.

The new group settings are also part of the Safe Communities Initiative the company started two years ago in an effort to monitor and detect bad content in Facebook groups. The announcement comes in the wake of recent findings that secret Facebook groups have been acting as gathering places for racist, offensive activity; in one example from last month, ProPublica found a group of Border Patrol agents joking about migrant deaths.

The name change itself isn’t likely to stop any bad behavior, as secret groups will still exist under the new label. Closed groups, which only let current members view group content and see who else is in the group, will now be labeled private but visible. Secret groups, which are hidden from search but still require an invitation to join, will now be labeled private and hidden.

Facebook says it uses AI and machine learning to “proactively detect bad content before anyone reports it, and sometimes before people even see it.” The flagged content then gets reviewed by humans to see if it violates Facebook’s Community Standards, but clearly, the system is flawed if offensive groups are still flying under the radar.

In April, Facebook updated its policies to hold admins to higher standards, committing to penalize an entire group if its moderators approve posts that break the platform’s rules. To help admins keep their groups within those rules, they’ll have access to a new tool called Group Quality, which gives them an overview of content that violates Community Standards. Admins will also have the option to share which rules were broken when they decline pending posts, remove comments, or mute members.
