On Process and Precedent: What the Removal of Parler Tells Us About Apple’s Unilateral Power

In the wake of the January 6th insurrection at the U.S. Capitol, decisions made by Big Tech companies to ban users from their platforms and to collectively close their gates to one specific app, Parler, have spurred widespread debate about censorship, broadly defined: what constitutes censorship, and how private companies can balance open, free expression with effective content moderation in the interest of public safety.

This is a polarizing debate, and one without clear-cut answers. While the episode offers no shortage of nuance and perspective to consider, it serves to highlight Apple’s opaque and arbitrary management of its App Store, and the threats that poses to free speech and freedom of information – not just in the U.S., but worldwide.

The Dangers of Apple Censorship

The question of how, when, and why private companies moderate content on their online platforms is a fraught one. The case of Parler is no exception: some claim its takedown constitutes corporate censorship, while others point to the violence of January 6th as evidence of the real-world harm such an app can cause. This dynamic can encourage people to focus on the nature of the content that was suppressed rather than on the broader, ecosystem-wide effects that also demand consideration.

As Kara Swisher, whose interview with the Parler CEO was cited by Big Tech as justification for Parler’s removal, notes:

There is nothing that Parler was doing that companies like Facebook were not guilty of too and in larger measure and for a very long time. While I would not go as far as calling the company a scapegoat, as it did allow its system to be used in dangerous ways, it certainly got a lion’s share of the hurt that rained down on tech and that others probably deserved even more.

Similarly, regarding Amazon’s removal of Parler from its cloud service, Amazon Web Services, free expression activist Jillian York noted in a Twitter thread that “if we just give this a pass without question and fail to have a societal-level conversation about this, then we’d also better be ok with Zoom censoring Leila Khaled’s talk at SFSU too”. While York noted that these two scenarios “are not the same,” it is through this same lens of precedent and practical implications that we view the story of Parler.

And it is that “societal-level conversation” mentioned by York that interests us most. Since Apple removed GreatFire’s FreeWeibo iOS app in 2013, we have been monitoring how Apple manages its App Store and shedding light on the opaque ways in which the company delists certain apps from certain country-based App Stores, often at the behest of authoritarian governments.

In the case of Parler, it was Apple, a private-sector company, that decided on its own to remove the app, with no prodding from any government. In fact, it is no stretch to say that certain actors in the federal government probably would have preferred to see the app remain available. This speaks to the level of nuance that comes with the “censorship” debate and how differently these situations play out in different contexts.

However unsavory one might consider the Parler app to be, it is worth assessing this situation from the perspective of what it shows us about Apple’s power and the dangers inherent in how the company makes these types of decisions. The Electronic Frontier Foundation (EFF) has similarly reviewed the situation to consider what it means when “a group of companies comes together to ensure that forums for speech or speakers are effectively taken offline altogether.” We take a similar view while narrowing our focus to Apple. We believe it is risky, as a society, to hold our noses and say, “Well, in this one case, Apple was justified for x and y reasons.” The way in which Apple removed Parler from its App Stores remains troubling in terms of both process and precedent.

The Black Box of App Removal 

A warning, a twenty-four-hour deadline, and a single-page verdict were all that Apple presented to justify its decision to remove the Parler app from all 175 of its App Stores worldwide. The opacity of the decision-making process that triggered Apple’s warning and its enforcement of the company’s App Store review guidelines is concerning. While it may be tempting to look away this one time, it is not hard to imagine a future in which a similar justification is imposed on an app that isn’t Parler; there is, of course, “dangerous and objectionable content” on other apps, from other social media platforms to encrypted communication tools. If we are ok with Parler’s removal, then we should be ready for Telegram or Signal to suffer the same fate under a similar justification.

Such opacity is characteristic of every case of censorship by the Cupertino-based firm, as is the inconsistency at play; Apple actually gave Parler more information than it usually gives developers whose apps are removed. But as with the developers of, for example, Tibet-focused apps that have been removed, Apple never shared any criteria for what a satisfactory content moderation plan would have looked like.

With such opacity, Apple can remove apps for virtually any reason, including reasons that differ from what it tells app developers, since the company shares no details and is under no obligation to demonstrate the legitimacy of its decisions.

Apple’s Walled Garden

There is another element that makes Apple’s censorship decisions especially severe relative to those of Google and even Amazon: Apple holds a monopoly over its own hardware products. When Google removes an app from the Play Store, the app loses only one distribution channel and platform (albeit the main one) through which to advertise its services and reach new users. Android users can still download the app, just not via the official Google Play Store. Such “sideloading” means a slightly more complicated installation process, but nothing too restrictive, even for novice users (see the sketch below). In Amazon’s case, shutting down a customer’s servers has a more direct and immediate effect – if hosting is terminated, the platform usually becomes unavailable to all of its users, across all devices – but alternatives to Amazon’s servers exist. If an Amazon customer moves to another hosting provider, its users would likely not notice any difference.
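To make concrete how little stands between an Android user and a delisted app, here is a minimal Kotlin sketch of one common sideloading path: handing a locally downloaded APK to Android’s built-in package installer. This is an illustration under stated assumptions, not Google’s prescribed flow; the file name, the FileProvider authority, and the presence of the REQUEST_INSTALL_PACKAGES permission and a matching <provider> entry in the manifest are all assumptions for the example.

```kotlin
import android.content.Context
import android.content.Intent
import androidx.core.content.FileProvider
import java.io.File

// Minimal sketch: hand a locally downloaded APK to Android's built-in
// package installer. Assumes the manifest declares the
// REQUEST_INSTALL_PACKAGES permission and a matching <provider> entry;
// the file name is purely illustrative.
fun installSideloadedApk(context: Context, apkFile: File) {
    // On Android 7.0+ the APK must be exposed via a content:// URI
    // rather than a raw file:// path.
    val apkUri = FileProvider.getUriForFile(
        context,
        "${context.packageName}.fileprovider", // hypothetical authority
        apkFile
    )
    val intent = Intent(Intent.ACTION_VIEW).apply {
        setDataAndType(apkUri, "application/vnd.android.package-archive")
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
        addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    }
    // Android prompts the user to confirm the installation; no Play
    // Store review or Google approval is involved at any point.
    context.startActivity(intent)
}
```

The point of the sketch is the asymmetry it exposes: the entire installation path runs through the operating system’s own installer, with the user’s consent as the only gate. iOS offers no equivalent path.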

By contrast, being expelled from the Apple App Store means that an app developer immediately loses any ability to reach new iOS users, and current users stop receiving updates. All owners of iOS devices who want to download an app that Apple has removed from the App Store are simply unable to bypass Apple’s decision. Sideloading is impossible on iOS devices without “jailbreaking” the device, and jailbreaking is a fairly complicated procedure that carries real security risks for both the device and its user. For these reasons, Apple’s censorship powers are arguably the most problematic and powerful of any Big Tech company.

The Need for Transparency

To prevent politically motivated removals, Apple should only be able to enforce a removal within a clearly defined framework. By rewriting the company’s “App Store Review Guidelines” to include an exhaustive list of the criteria used to assess apps, Apple would significantly reduce its room to curate content and services on the App Store opaquely and arbitrarily.

Vague paragraphs and expressions such as the following, which give Apple carte blanche to remove anything not to its taste, should be removed from the “App Store Review Guidelines”:

  • “We will reject Apps for any content or behavior that we believe is over the line. What line, you ask? Well, as a Supreme Court Justice once said, ‘I’ll know it when I see it’. And we think that you will also know it when you cross it.”
  • “1.1 Objectionable Content: Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy.”
  • “1.1.6 False information and features, …”

Additionally, Apple should produce and publish new internal procedures for reviewing apps submitted by developers and for reviewing developers’ appeals should their apps be refused inclusion in the App Store or removed from it. The existing, opaque procedures inevitably lead to arbitrary enforcement decisions. And while we know little about apps that Apple removes following government requests, we know even less about apps removed at Apple’s own discretion.

The most immediate, coherent, and efficient way to change Apple’s behavior is to demand changes to its internal rules and, in particular, an end to the opacity that dominates Apple’s decision making. Apple censorship can only be fought by imposing a set of very specific rules – rules that likely won’t fit into any single regulatory solution aimed at Big Tech as a whole. The Apple ecosystem is unique.

Consistent Inconsistency

This is far from the first time that Apple’s opaque App Store practices have drawn scrutiny. The case of HKmap.live, an app that crowdsources and tracks the locations of protesters and police in Hong Kong, best illustrates how Apple was able, under the guise of protecting people and abiding by the law, to remove an app after pressure from Beijing.

Apple claimed that the app violated its guidelines and local laws, and that it had “been used to target and ambush police” and “threaten public safety”, without substantiating those claims. As Apple accounts for around 45 percent of the smartphone market in Hong Kong, the impact of removing HKmap.live was significant.

In other cases, such as the December 2016 removal of both the English- and Chinese-language versions of the New York Times app from the China App Store, Apple pointed directly to mandatory requests from the Chinese authorities to justify its censorship:

For some time now the New York Times app has not been permitted to display content to most users in China and we have been informed that the app is in violation of local regulations. As a result, the app must be taken down off the China App Store. 

Apple provided no details as to which local regulations the New York Times had supposedly breached. This is especially surprising given that the New York Times operates legally in China: it has offices, legally employs people in the country, and adheres to the many stipulations and restrictions imposed by the Chinese authorities so that it can continue to operate on the ground.

According to the Tech Transparency Project, Apple has fallen short when it comes to providing details to explain similar removals of no fewer than 964 apps containing politically sensitive content. The number of politically motivated removals is very likely higher still, since Apple has removed tens of thousands of apps from the China App Store – including more than 47,000 in August 2020 alone – frequently claiming that these apps contain “porn or illegal gambling”.

In the cases of both the HKmap.live and New York Times apps, Apple acted in direct response to a request from the Chinese government, albeit under different circumstances; in the case of Parler, Apple acted of its own accord. What is consistent, though, is the inconsistency: in these and so many other cases, the app review process and the decision to reject or remove an app from some or all of its global App Stores is utterly murky, leaving no discernible path for app developers to follow in order to ensure compliance. The only precedent worth noting is Apple’s unilateral power.

Looking Forward

It is urgent that we take full measure of the issues posed by Apple’s policy of maximizing its financial gain to the detriment of human rights. Only by forcing Apple to define objective, non-discriminatory, and politically unbiased criteria for assessing apps on the App Store can Apple users and app developers understand the company’s decisions and why it makes them. Apple should set up transparent and detailed internal procedures for reviewing apps, and publicly justify its actions whenever it restricts the fundamental freedoms of its users. This needs to happen if we are to have any hope of preventing Apple from trampling on its users’ basic human rights of free speech and access to free information.
