Parler is an online social networking platform marketed as a ‘free speech-driven’ alternative to sites such as Twitter, Facebook and Instagram. In 2021 it became the most popular site among supporters of Donald Trump, who refer to it as a “conservative social network” or an “alt-tech platform”.
Users have used the platform to share right-wing views, express support for Donald Trump and promote conspiracy theories about election fraud. However, it has also been noted for harbouring extremist and potentially dangerous content that has gone unchecked due to Parler’s inadequate moderation. For example, in January 2021, when many of the big tech companies (Twitter among them) deleted accounts promoting violence after the Capitol riot in Washington D.C., Parler chose not to take action against such accounts – a clear sign that it lags behind in tackling extremist and dangerous content on its platform.
This article will examine the measures Parler has taken (or failed to take) to address dangerous content on its platform, and consider potential solutions this ‘alt-tech’ provider could implement to make its service safer for all users.
Apple Removes Parler From App Store Due to ‘Inadequate’ Measures to Address Dangerous Content
Apple recently decided to pull Parler from its App Store due to the platform’s inadequate measures to address dangerous content. This news has sparked a great deal of debate about whether Apple should lay down the law in this area, or allow platforms to take an “every man for himself” approach.
In this article, we’ll look at why Apple took this action and how it could affect both Parler and the rest of the tech industry.
Apple’s Statement Regarding Parler
Apple released an official statement on January 9th, 2021, regarding its decision to permanently remove Parler—a social media platform known for failing to adequately address dangerous content and policies—from the App Store.
In its statement, Apple pointed out that “Parler has not taken adequate measures to address the proliferation of dangerous and harassing content” on its platform, and rejected Parler’s claims that it had developed improved moderation practices and that the removal from the App Store was “political.” Apple also outlined several “unresolved issues” with Parler that ultimately led to its permanent removal:
- Failure to eliminate posts or accounts inciting violence or promoting potentially dangerous or unlawful activities;
- Lack of effective moderation of certain notable high-profile channels;
- Apparent inability or unwillingness to identify and remove content containing sexual solicitation;
- Refusal to follow guidance from Apple concerning implementing robust user moderation features.
Apple stressed its commitment to “providing a safe and secure environment where users can freely express their ideas without fear of violence or harassment” when making this decision. However, it also warned developers against creating apps with “the intent of providing a haven for those who will not comply with reasonable safeguards.”
Parler’s Response to Apple’s Decision
When Apple removed the Parler app from its App Store on January 9, 2021, the social media platform responded with a statement criticising the tech giant and attempting some damage control to limit further harm to its reputation.
Parler’s CEO John Matze declared that Apple had continually treated the company as an outcast, and alleged that it was unfairly targeted due to its conservative reputation. He addressed the dangerous content found on Parler, stating that protecting users from harm was of utmost priority for him and his team; however, he asserted that the issue could have been resolved with “bipartisan conversation and more freedom for people of all views”.
The company then promised improvements to its content moderation policies, stating that it had already been taking steps to create a safe environment for users by:
- Actively moderating certain terms (e.g. incendiary words).
- Suspending accounts posting inappropriate material.
- Introducing better algorithms to detect hate speech.
Lastly, they asked Apple to reconsider its decision and urged them “to stand proudly for free expression within the iOS ecosystem”.
Parler’s Inadequate Measures to Address Dangerous Content
In response to the US Capitol riots in January 2021, Apple removed the social media app Parler from its App Store due to the platform’s inadequate measures to address dangerous content. Parler has since updated its content moderation protocols, but Apple has yet to reinstate the app.
This article will discuss the consequences of Parler’s failure to take adequate measures to address dangerous content, and explore the implications of Apple’s decision to remove the app from its App Store.
Lack of Moderation and Enforcement of Community Guidelines
Parler, a social media platform founded in 2018, has been criticised for inadequate measures to address dangerous content on its app. As reported by news sources, Parler does not have the resources to adequately vet and moderate posts submitted by users. The company also lacks a practical approach to enforcing their Community Guidelines.
Rather than moderating content before it is posted, the company employs an approach that relies primarily on user reports of inappropriate material. Unfortunately, this culture of user-driven moderation has allowed material such as harassment and even threats of violence to remain on the platform without being quickly removed. Some have also raised concerns over the visible presence of extremist groups such as neo-Nazis and white supremacists, who use Parler’s app as an online haven for their views.
The inadequate enforcement of community guidelines has been identified as a major contributor to the escalation of violence connected to recent events, including the US Capitol attack in January 2021 and other extremist activity throughout 2020. While Parler executives maintain that their platform supports freedom of speech, criticism of their silence and laxity in dealing with dangerous content continues to mount.
Lack of Effective Content Filtering
Parler has been criticised for its lack of effective content filtering and moderation practices, allowing dangerous, hateful and unfounded conspiracy theories to spread throughout the platform. In addition, while Parler has released a few statements about its commitment to “discourage or eliminate inappropriate content”, it remains unclear how this is being implemented in practice.
So far, the platform has failed to develop adequate moderation tools or put in place policies that actively discourage users from posting potentially harmful content. For example, there is no process for reporting or flagging potentially dangerous or threatening content, and no system akin to YouTube’s safety mode feature. Furthermore, though Parler advertises itself as “an open platform which strives to maintain a civil culture by requiring those who use it to adhere to the guidelines set out in the terms of service”, when it comes to taking action against objectionable material posted on the platform its responses are often ineffective or non-existent.
On top of this, Parler’s Terms of Service allow users to post material that violates US law and international standards on free speech. This includes posts threatening violence against others and false health information, such as misinformation about COVID-19 prevention. Reports also suggest that some individuals have taken advantage of Parler’s weak moderation regime by posting calls for terrorist attacks or anti-Semitic slurs with little fear of consequences, given the site owners’ stated dedication to not interfering with user posts unless they breach US law (a threshold that can be hard, if not impossible, to establish in some cases).
It is clear that despite making numerous public statements about protecting freedom of expression on the platform, Parler must do more on this front before they can be viewed as a safe environment for communication and social interaction.
Lack of Transparency in Content Moderation
Parler has been criticised for its lack of transparency in content moderation. Reports have suggested that Parler’s moderation policies and procedures are nonexistent or inadequate in addressing dangerous materials on the platform. Although Parler has stated that it removes content containing hate speech, explicit material, violent threats, and other violations of its terms of service, there have been numerous examples of these types of content remaining on the platform, suggesting a lack of enforcement efforts or proper policy guidelines.
The opacity surrounding Parler’s moderation process is concerning, given that such content can have serious implications for public safety – including the potential for offline violence committed by participants using the platform to organise and promote it. Parler has also been accused of censoring accounts that express opposing political views or challenge current industry power dynamics. These practices indicate a clear failure to embrace an open and transparent approach to content moderation – one that would allow users to trust the platform while providing adequate protection from online harms such as hate speech, extremist views, cyberbullying, and violent rhetoric.