
Resetting social networks: how to use the conclusions from the 'Digital Discrimination & Social Networks Conference'

       

[Text based on the conclusions by David Casacuberta]

Ten years ago we were in this same place, presenting the conclusions of another European project, El4EI: E-learning for E-Incorporation. At that time, the difficulty was conveying the importance of the Internet and social networks to the third sector: showing that they offered a whole series of possibilities, of transformations, of actions, but also a series of threats and issues to resolve. Ten years later the situation has changed a great deal, and it has been marvellous to see at the ICUD conference the quantity of proposals, the variety of groups and the innovative approaches to this topic.

If we had to choose one term to cover the different contributions presented here, and to tie all this variety together, I believe it would be the idea of visibility and invisibility. Danah Boyd marked it from the beginning of the conference, commenting on how social networks, and the web in general, are basically a mirror of what happens in the physical world: we can find everything there, good, bad or indifferent. The problem, for example, is not so much cyberbullying in itself, but the bullying that already existed before digital networks. When we have the semantic web we will have semantic bullying; big data will bring big bullying, and so on.

At the same time, as Joan Pedregosa also commented in the introduction, social networks amplify these phenomena, giving them more force, while rendering other aspects invisible, often the most poignant ones.
We may start, for example, at the origin of the World Wide Web: one of the first problems to appear was racist speech and hate speech. Who were the first to see the Internet's potential for spreading problematic content? Among others, the neo-Nazi groups who took advantage of the lack of international legislation to promote revisionist texts denying the Holocaust. The first things that spread widely on the Internet, and that were discussed and generated conflict, were the websites of Ernst Zundel, with sensationalist arguments denying the Holocaust. Moreover, the usual response is censorship, but this turns out to be counter-productive: the more a text is censored, the more people are interested in it and the more it circulates. If we look at all the presentations from the two-day ICUD conference, we see that, far from being solved, the problem has worsened. From the 'simple' texts of Holocaust denial we have come to all kinds of insults, threats, bullying and many more problems. The solution is not easy, and I will try to address it briefly at the end.

It could therefore be interesting to concentrate on the invisible things, since the visible ones are already well known. I would like to recall some of the points that have been made about invisible or hidden discrimination. Danica Radovanovic, for example, mentioned that the web itself can be invisible to a teenager who believes that Facebook is the web, uses Google only occasionally to look things up, and is not conscious of the rest of the possibilities that this space offers.

This material has been produced with the financial support of the Fundamental Rights and Citizenship Programme of the European Union. The contents of this material are the sole responsibility of its author and can in no way be taken to reflect the views of the European Commission. Project code: JUST/2011/FRAC/AG/2638

A recurring subject has been that the victims wish to be invisible. It was commented in more than one panel that silence and silencing are often a response to situations of bullying, to threats, etc. The result is that perpetrators gain strength under a mantle of invisibility. Facebook, Google and company are very unwilling to close problematic pages, and only after a lot of pressure do they decide to do so.

We heard in Gavan Titley's presentation how the act of expressing oneself can render invisible the fact that an expression involves an action and can generate a series of problems. Following the same line of thought, we imagine that putting a text on the Internet is a neutral act, and people do not think that it could have consequences. And, especially, we have seen how this applies to the mass media, which can amplify the most harmful trends, attempts and insults, promoting them and giving them mass coverage. From this point of view, a peculiar interaction arises between digital media and mass media, along with other actors, in a very complex process.


Shanjay Sharma has shown that beyond the explosions of hatred that we can observe more or less regularly on social networks, and that are covered regularly by the press, there is something hidden; this concealed discrimination is the most worrying, and it has been described as 'ambient noise'. Many seemingly banal conversations on social networks are loaded with discrimination. It has been very interesting to see how this hidden phenomenon can emerge from the analysis of a simple hashtag like #notracist. I imagine that as these kinds of studies advance, we will find more and more such phenomena and will render visible a whole series of invisible issues and problems which we currently do not discuss openly.

Even though detecting these invisible processes is important, it is even better to propose solutions. We saw some very interesting ones at the ICUD conference, which I will attempt to summarize. The first and most important, mentioned several times, is to admit that the problem is not in the technology; it is not caused by the networks, but it surfaces there. If we manage to expel cyberbullying from Twitter, the only thing we will achieve is that cyberbullying resurfaces somewhere else. With prohibition alone we are not going to solve anything. We need to change people's mentalities and not only the technologies.

But when focusing on social networks we have to think from the inside about what solutions can appear. A generic one, underlined by Dolors Reig, is the idea of a comprehensive education.
The way to address discrimination is to train and educate everyone. It is, for example, not simply the responsibility of girls not to post inappropriate selfies online; we must also teach everyone, including boys, not to share and forward these types of content. David Dueñas has shown us that it is complex to change the mentality of xenophobes, racists and homophobes... but it is much easier to address the person who, without reflection, forwards discriminatory content without engaging with the implications it carries. Sometimes, if a person is made aware of the damage s/he is going to cause to the people who are ridiculed, it can be stopped. This is one effective intervention for addressing problems on social networks: implicating the individual.



Finally, I would like to dedicate some thoughts to the question of the future of social networks. What is going to happen, and how can we prepare before these networks become more powerful? If we believe the press, the important term right now is 'Big Data': the idea of an enormous quantity of information that can be processed and from which knowledge can be extracted. A while ago, sociologists argued in a study that they had discovered a very simple algorithm to detect, with high probability, the sexual preferences of a person from his or her friends on Facebook. You could analyse the number of friends who declared themselves heterosexual, homosexual, bisexual, etc., and from there establish the sexuality of a person. It is not perfect, and surely there will be better tricks, but it keeps getting easier to define who we are and what our preferences are. Think of everything Amazon knows about those who buy books, domestic appliances, music and so on online. The information that Amazon obtains from us grows every day. Danah Boyd talked about the great invisible algorithms behind the big Internet companies: Google, the system Facebook uses to surface some inputs over others, personalized online systems, the profiles that companies like Amazon have of us, etc. These are spaces over which we have no control, and they are already generating problems of discrimination. My intuition is that these problems will increase.

I believe there are two big problems that deserve our immediate attention. The first is ostracism: there is a whole invisible web that does not match certain demographic patterns, contents, offers... and it remains hidden from search engines. These pages are there, just like the pages with millions of visits, but they remain unknown to the public eye. And behind these demographic studies there are, clearly, discriminatory processes: they bear certain types of individuals in mind and not others, because the rest do not fit the commercial interests of a given company. The other problem is the filter bubble. Suppose I go to Amazon and decide to buy a Leni Riefenstahl movie about Nazi Germany. It is quite 'normal' for Amazon to then recommend "Mein Kampf", "Holocaust lie"... and a whole series of fascist material, because that is what the majority of people who have bought Riefenstahl's movie have done.
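The co-purchase mechanism behind this kind of recommendation can be sketched in a few lines. The following is a minimal illustration, not Amazon's actual system: the item names, the baskets and the `also_bought` helper are all invented for the example. It simply counts which items most often co-occur with a given purchase, which is enough to show how the majority's buying pattern shapes what everyone else is shown:

```python
from collections import Counter

# Hypothetical purchase histories (invented, synthetic data):
# each set is one customer's basket of purchases.
purchases = [
    {"riefenstahl_film", "mein_kampf"},
    {"riefenstahl_film", "mein_kampf", "revisionist_pamphlet"},
    {"riefenstahl_film", "film_history_essay"},
    {"film_history_essay", "weimar_cinema_study"},
]

def also_bought(item, histories, top_n=3):
    """Naive 'customers who bought X also bought Y' recommender:
    count how often other items co-occur with `item` across all
    baskets and return the most frequent ones."""
    counts = Counter()
    for basket in histories:
        if item in basket:
            counts.update(basket - {item})
    return [other for other, _ in counts.most_common(top_n)]

# Whatever this customer's reason for buying the film, the
# recommendations reflect what the majority of its buyers did.
print(also_bought("riefenstahl_film", purchases))
```

Nothing in the co-occurrence counts distinguishes a film historian from a sympathizer; the minority's intent is simply drowned out by the majority's baskets, which is exactly the bubble described above.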
In these systems it is very easy for a person who starts by following a curiosity to come upon more radical material on the same topic and to find much stronger positions, which can lead to the adoption of a radical position. If someone is a bit misogynous and enters misogynistic forums, he will discover that there are 'many like him', find his thoughts and opinions reaffirmed, and even learn more extreme behaviours. It can be worse when the person is a teenager, still forming opinions and positions, who can end up believing that a certain opinion is the norm and thus acceptable, because the 'whole world' in the forum behaves in the same way. Search engines promote this behaviour and the social networks promote it, because they tend to gather people with the same kind of interests for their own advantage. But here civil society and the third sector can react and counteract to make a difference. As Danica Radovanovic said, it is imperative that these systems be open source, with free software, so that we can all know how they work and be able to ask for reviews of these filters to render the hidden visible.

Videos and details about the presentations can be found at:
http://digitaldiscrimination.eu/conference/


David Casacuberta - Conclusions of the conference  