
more videos in this group

  • Ureka.org - Steem Powered Hybrid Social Network Has a New Look! :) I'm happy to announce that after a few days of intense coding, ureka.org has updated visual themes, a new homepage and various other improvements. All feedback is welcome in the comments below! For those who are unaware, Ureka.org...
  • Steem is several years old now; it is a blockchain-based social networking ecosystem that pays you to post and comment, plus has a plethora of other benefits. Despite its greatness, it has also had its fair share of problems, but a recent update to the network appears to have made a big shift...


How Facebook Decides If You See Nudity or Death (HBO)

    added by ura soul

    Facebook employs 4,500 content moderators around the world. Moderators get two weeks of training and a stack of manuals to help them police the site for racism, misogyny, violence, and pornography. VICE’s partners at The Guardian obtained more than a hundred of these manuals and they offer the first-ever look at the sometimes logical, sometimes inexplicable ways Facebook asks a few thousand people to help patrol its close to 2 billion users.

    This segment is part of the May 23rd VICE News Tonight episode. Watch VICE News Tonight on HBO Mondays through Thursdays at 7:30 PM ET.

    Subscribe to VICE News here: http://bit.ly/Subscribe-to-VICE-News
    Check out VICE News for more: http://vicenews.com

    Follow VICE News here:
      • Facebook: https://www.facebook.com/vicenews
      • Twitter: https://twitter.com/vicenews
      • Tumblr: http://vicenews.tumblr.com/
      • Instagram: http://instagram.com/vicenews

    More videos from the VICE network: https://www.fb.com/vicevideo