  1. Twelve distinguished subgroups will thoroughly investigate Facebook’s private data to pinpoint the causes of this global phenomenon

There is nothing more representative of our world than Facebook. As of June 2019, Facebook had 2.41 billion active users, roughly a third of the global population. That fact alone makes Facebook one of the biggest influencers in the world. Many people live vicariously through Facebook and get most of their information from it; in that sense alone, Facebook is influencing democracy around the world.

The term ‘Fake News’, popularized by President Donald Trump, essentially gained momentum on Facebook. People who didn’t read the news saw radicalizing posts on Facebook, curated using polarization as an engagement driver. These radicalizing posts speak to the inherent biases people hide and end up influencing their opinions. There is also a Pandora’s box of targeted ads: any paying advertiser can create ‘filter bubbles’ that target ads by age, gender, interests, country, region, and so on to generate leads, views, clicks, and conversions. Facebook’s algorithm, it seems, tends to use polarization as a content driver: it surfaces outrage stories because people look for and click on outrage stories.

When the Cambridge Analytica scandal broke in 2018, it was no longer just a matter of speculation: Facebook was alleged to have influenced democratic votes in the 2016 United States presidential election and the United Kingdom’s 2016 Brexit referendum. There was instant outrage, and there were threats of legal action. But how do you prosecute something that is difficult to understand? To fix that, several renowned researchers were given access to internal Facebook data. This highly sensitive dataset was analyzed to answer the most pressing social questions of our day and age, and the analysis provided some incredible insights.
The key themes of the questions include disinformation, politics, social media, and technology. Facebook evidently used user behavior data and targeted content geared to make people click on things. The real question is: can (and did) Facebook swing elections? The first group that received the data divided the questions among twelve subgroups.

  2. The studies and their assigned teams include:

• The effect of Facebook content on the 2017 congressional elections in Chile, which saw the conservative Chile Vamos coalition come to power, will be analyzed by the Pontifical Catholic University of Chile.
• The effect of partisan ‘news’ widely available on Facebook on the steady rise of populist parties in Italy will be analyzed by the University of Urbino, Italy.
• The effect of Facebook news stories on civic engagement will be analyzed by researchers in Taiwan.
• The effect of fake news and dubious stories being shared rampantly on Facebook will be analyzed by R. Kelly Garrett, an associate professor at Ohio State University.
• The effect of sharing fake news with peer groups will be analyzed by Nicholas Beauchamp’s Northeastern University research team, which will also try to monitor how fake-news sharing changes whenever Facebook’s algorithm is altered.
• The exposure value of fake news will be calculated by Magdalena Saldaña’s Chilean team.

The broader studies are trying to find out why people click on, read, believe, and share fake news. Private, sensitive internal Facebook data was made available so these questions could be analyzed factually, and the analysis should help pinpoint the root causes of certain key global issues. One key priority is to frame the results of these studies meticulously. The main aim is to find patterns of misinformation spreading online, specifically through Facebook. The second aim is to determine how algorithmic changes affected the sharing behavior of regular consumers and sharers of fake news. The third aim is to figure out why there has been a sudden spike in fake-news consumption and sharing when fake news has been around for at least 20 years. These studies are overseen by Social Science One, an organization that specializes in facilitating partnerships between holders of huge data caches (Facebook) and research institutions.
Social Science One also protects the researchers from interference (and edits) by Facebook when they publish their results, and it protects Facebook’s sensitive data from leaking. The analyzed dataset will include:

• shared links
• targeted age range
• location of the post

  3. 
• location of the Facebook users who respond to incorrect information
• readability and read ratio of the posts shared by people
• fact checking associated with the read/shared posts

The most significant pitfall of these studies is data privacy. The researchers, as well as Facebook, have pledged not to violate the privacy of any of the billions of Facebook users. Another pitfall is Facebook’s own record on data privacy. For instance, after the Crimson Hexagon investigation, Facebook suspended its ties with Crimson Hexagon over data-harvesting claims. Allegedly, the harvested data was handed over for government surveillance via contracts with the US, Turkey, and the Kremlin. Facebook has repeatedly refused to respond to these allegations.

To limit these pitfalls, Facebook will restrict data access to a limited set of logins, and each login and its activity will be logged. There are also several layers of data protection in place. Firstly, approvals by each university’s Institutional Review Board will be stringent. Secondly, Social Science One and Facebook will put multiple protective layers in place to guard the data. Thirdly, individual Facebook accounts that read and share information will not be identifiable through the dataset’s URLs (Uniform Resource Locators), because differential privacy will be applied: small, random errors will be injected that corrupt the data at a minute level, but in the grand scheme of things the data will remain reliable. Fourthly, researchers’ data access will be restricted by disabling data downloads and data storage; only data access via secure Facebook servers will be enabled. Fifthly, a data budget will be assigned to each researcher based on their logged activity and the information they have gathered.

Beta testing has already begun. These twelve subgroups and their data analysis will pave the way for the next group, which will receive broader data access. Since the demographic data will be stripped out, there should not be serious data privacy concerns.
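The differential-privacy layer described above can be sketched in a few lines. This is a minimal illustration of the general technique (the Laplace mechanism), not Facebook’s actual implementation; the share-count query, the epsilon value, and the function names are assumptions for the example:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample zero-mean Laplace noise via inverse transform sampling."""
    u = random.random() - 0.5           # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float = 0.5) -> float:
    """Return a differentially private answer to a count query.

    A count's sensitivity is 1 (adding or removing one user changes
    it by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: how many users shared a given URL.
shares = 10_482                          # true value, never released
noisy_shares = private_count(shares)     # what a researcher would see
```

Each query returns a slightly perturbed answer, so no single account can be inferred from any one result, yet aggregates over many URLs stay close to the truth.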
If this system works, Facebook may come to trust more data-analysis firms, which might pave the way to data transparency. The world has become a cauldron of hatred these days, and there is an underlying cause for that. Maybe if we follow the data, it will talk to us.

“Data is what you need to do analytics. Information is what you need to do business.” – John Owen

“Data will talk if you’re willing to listen to it.” – Jim Bergeson

For more details: https://opinionest.com/