How Instagram communicates algorithmic transparency

Sukhnidh Kaur
Sep 19, 2021

Six years after Instagram’s opaque algorithmic processes first etched themselves into public memory, the platform has revealed how they function in a series of official communications. This article analyses Instagram’s posts about its algorithms from June 8, 2021 until August 19, 2021 and identifies three goals: (a) shifting agency from the algorithm to the user, (b) shifting accountability from the platform to the user, and (c) building trust between the user and the brand. I suggest that the platform frames its foray into algorithmic transparency as an attempt to provide agency to users. However, the move reveals itself as a strategic business decision that continues to center algorithmic appeal and, in doing so, does little to meaningfully alleviate the disciplining effects of the platform’s algorithmic power. The method of algorithmic control, upon which Instagram’s economic gains rely, simply shifts from covertness to assertion.

Background
Instagram, a photo and video sharing social networking site with over 1 billion monthly users [1], was launched in April 2010 and acquired by Facebook in April 2012. For its first six years, users’ focus remained largely on concrete, feature-oriented material and design affordances [2], including filters, likes, tags, and the ability to share 1:1 aspect ratio posts to their feeds. The existence of the “Instagram algorithm” became widely known in March 2016, when the platform announced that it would implement feed personalisation, changing the nature of feed ordering from chronological streaming to algorithmic curation [3]. The change was motivated by the fact that chronological ordering led users to miss 70% of the posts in their feed, and nearly 50% of the posts from their ‘close connections’ [4]. Public responses to this announcement were overwhelmingly negative, and included counter-narratives of algorithmic hegemony, violation of user autonomy, prevalence of commercial interests, and deification of the mainstream [5].

Since then, various theories have emerged surrounding Instagram’s curation and ranking algorithms. User-generated blog posts [6], news articles [7], and marketing platforms [8] have for years theorised about how platform algorithms function and subsequently determine user experience. This is reflected across post titles such as DevDiscourse’s 2019 blog “Instagram algorithm change: Users left scratching heads in confusion” [7] and Intelligencer’s 2018 article “Is Instagram Strategically Withholding My Likes?” [9]. Further theories revolve around ‘gaming’ Instagram’s algorithms in pursuit of higher visibility. Engagement pods, for example, consist of users agreeing to mutually like, comment on, share, or otherwise engage with each other’s posts to game Instagram’s algorithm into prioritizing their content [10]. Other tactics include hosting giveaway contests to engage followers [11], creating ‘follow trains’ wherein groups of people agree to follow each other [12], interspersing posts with selfies, which are algorithmically prioritised [13], and leaning into video content such as IGTV and Reels instead of ‘static’ posts [14]. Alongside these hacks emerge ‘conspiracy theories’ surrounding Instagram’s algorithms. These include supposed shadowbanning of business accounts [15] and a mythical ‘reach cap’ that limits posts’ visibility to 7% of users’ followers [16]. Instagram’s algorithms, and the lack of transparency surrounding them, have been a source of great frustration and contention, and have been considered central in shaping user behaviour on the platform.

On November 25, 2019, Facebook AI published a blog post titled “Powered by AI: Instagram’s Explore recommender system” [17], which highlighted the systems constituting its ranking funnel, including custom query languages, lightweight modeling techniques, and tools enabling high-velocity experimentation. On June 8, 2021, Adam Mosseri, head of Instagram, published a blog post titled ‘Shedding More Light on How Instagram Works’ [18]. Unlike Facebook AI’s earlier post, which was aimed at developers, Instagram’s post was written in an accessible manner and aimed at everyday users. It asserted that there is no one, overarching algorithm that determines what users see and don’t see on Instagram. The blog post, the first in a series of posts on the issue, read: “One of the main misconceptions we want to clear up is the existence of ‘The Algorithm.’ Instagram doesn’t have one algorithm that oversees what people do and don’t see on the app. We use a variety of algorithms, classifiers, and processes, each with its own purpose.” [18] Mosseri provided a 2,422-word deep dive into Instagram’s ranking and curation system because, in his words, ‘it’s hard to trust what you don’t understand’. Since then, Instagram has strategically forayed into the territory of algorithmic transparency.

Method

This article documents Instagram’s official communications about its algorithms across one blog post, one Reel, four carousel posts, and three IGTV videos published between June 8, 2021 and August 19, 2021. Post titles, captions, and key themes are noted. The site of communication, title of post, post details, and date of posting are included in reverse chronological order. Posts were accessed from Instagram’s official @creators account and from about.instagram.com, the platform’s official website.

Table: Instagram’s official communications surrounding algorithmic transparency from June 8 2021 until August 19 2021

Instagram’s scrupulously crafted communications surrounding its algorithms serve three broad purposes — (a) shifting agency from the algorithm to the user, (b) shifting accountability from the platform to the user, and (c) building trust between the user and the brand. Taken together, these communications suggest that Instagram’s announcement of its algorithms’ functional capabilities is part of its effort, alongside other initiatives like introducing privacy controls, to place agency in the hands of users. From this plan emerges a question — why is the platform beginning to relinquish control now, eleven years after its conception? In his blog post ‘Shedding More Light on How Instagram Works’, Adam Mosseri writes: “We want to make the most of your time, and we believe that using technology to personalize your experience is the best way to do that.” [18] While this may well be true, Instagram’s announcement reflects a broader culture of leaning into narratives of public accountability surrounding big tech in 2021. This strategic move allows Instagram to stay on top of the social media game, retaining both users and their contribution to its economic agenda — one that necessitates the maintenance of algorithmic control.

Shifting agency
Opaque algorithms strip users of agency by subjecting them to the mysterious, ever-changing nature of platform surveillance. Content creators constantly modify their use of social media to outmaneuver perceived changes in platform algorithms [19], tactically navigating their sense of being governed by algorithms by “playing the visibility game” [20] and fending off the constant threat of invisibility [21]. Instagram’s language across its official communications consistently shifts agency away from the algorithm and towards the user. For example, its post on @creators on July 10, 2021, captioned “✨You✨ influence what you see (and don’t see) on Instagram!”, places emphasis on the user’s role in feed curation. It suggests that algorithms simply respond to people’s behavioural cues, a claim supported by previous studies. This ‘shifted’ agency allows Instagram to mitigate users’ increasing sense of powerlessness in the face of changing algorithms that determine visibility, engagement rates, and user experience.

Shifting accountability
Instagram’s reminder that users exercise agency comes at a time when the platform is under criticism for the practice of “shadowbanning”. Shadowbanning occurs when accounts or posts are algorithmically ‘downgraded’ [22] and their visibility is restricted without warning. It has become a particularly contentious issue because such restriction often targets ‘borderline’ content [22] that may not go against Instagram’s community guidelines, but may be offensive, controversial, sexual, or politically charged. Further, the effect of shadowbanning on marginalised users is outsized [23]. In ‘Shedding More Light on How Instagram Works’, Mosseri attributes this perceived restriction to rare mistakes in content moderation and the fact that most users look at less than 50% of their feed. Instagram claims that purposeful algorithmic downgrading does not exist, but that it is working to improve its communication of how content visibility works [18]. In this manner, it implies that users’ dwindling engagement is not the platform’s fault, but their own.

Building trust
In the first line of his blog post, Mosseri admits that ‘it’s hard to trust what you don’t understand’. Instagram elicits trust by revealing intangible rules and helping users achieve much-coveted algorithmic appeal. Its official communications overwhelmingly suggest: “we are on your side”. The platform’s metaphorical “pulling back the curtain” and “shedding light” imply that users are being let in on a trade secret that will help them fare better in terms of engagement. In doing so, Instagram structures the narrative of algorithmic transparency around algorithmic appeal. Trust-building has an undeniable impact on brand loyalty, which in turn informs consumer behaviour. In the face of rising public demands for accountability from big tech, then, retaining user trust becomes critical.

Discussion
Scholars often reference Foucault’s conceptualisation of discipline [24] to structure their understanding of algorithmic control because the latter is eerily reminiscent of the former, from the pervasiveness of its power to its restraining effect on its subjects. Similarly, this article draws on Bucher’s parallels between Foucault’s discipline and the disciplining power of algorithms, specifically her idea of participatory subjectivity. In her work on algorithmic power on Facebook, she argues that social media platform users are participatory subjects produced by algorithmic mechanisms [25]. Platforms train users to behave in specific ways in their pursuit of visibility through a system of curation and ranking-related rewards, and users take cues from this system, in turn orienting their labour of visibility, self-presentation, and engagement towards algorithmic appeal.

Platforms govern users in order to make them reach, in Foucault’s words, their ‘full potentiality’ as ‘useful’ individuals [24, 25]. This powerful assertion gives rise to two questions: who are ‘useful individuals’ on social media platforms, and why are they useful to social media platforms? In the Facebook assemblage, Bucher contends, a useful individual is one who participates, communicates, and interacts. Similarly, a useful individual on Instagram is one who interacts and elicits, in Instagram parlance, ‘engagement’ from other users. This usefulness arises from Instagram’s reliance on user-generated data. All digital interactions — including liking, sharing, scrolling, and passively gazing — generate data that is accessible to Instagram [26]. In this way, users constantly perform the inadvertent labour of data production. This data, once mined and analysed, contributes to Instagram’s understanding of how and why people make decisions about the information they encounter online. This knowledge shapes the business strategy of a corporation valued at USD 100 billion towards its end goal of economic gain.

Turn for a moment to an earlier assertion that platforms train users to behave in ‘specific ways’. Now that the end goal is abundantly clear, the next question that arises is: what are these ‘specific ways’? Instagram’s communications on @creators — its one-stop shop for users who produce digital content — overwhelmingly encourage users to embark upon a pursuit of visibility that produces digital interactions, which in turn generate use-value for the platform. This is seen, for example, in its use of phrases such as ‘Tips to grow your followers’ (Aug 24, 2021), ‘Follow these trends to stand out on Reels’ (Aug 13, 2021), and ‘…sharing tips for long-term engagement’ (Aug 3, 2021). For six years, these interactions were performed by users under the gaze of an obscured set of curation and ranking algorithms. Algorithmic imaginaries [27] and folk theories [28] — that is, user-generated perceptions about algorithms that don’t necessarily mirror their functional capabilities — have shaped labour practices on Instagram. In response to the above question, then, Instagram has used both its official communications and the existence of its algorithms to train users to labour in specific engagement-seeking ways that contribute to its economic goals. Note here that users’ labour on Instagram does not only benefit the platform; it is often self-directed and entrepreneurial [cite]. However, a growing body of work suggests that this labour is aspirational, unevenly rewarded along the lines of gender, and inadequate in providing users with economic, social, cultural, or political capital proportional to the labour of visibility.

Image: 9 Ways to Beat Instagram’s Algorithm For Better Reach and More Likes, neilpatel.com

Having understood that the existence of Instagram’s algorithms is a necessary component of its business strategy, we arrive at the nature of algorithmic control and ask a final question: is it the very existence of Instagram’s culture of algorithmic ranking, or the obscured nature of its algorithms’ functional capabilities, that diminishes users’ agency? The answer is that both do, because the method of control has simply shifted from covertness to assertion. Prior to June 2021, users performed the labour of visibility [29] on Instagram by understanding its algorithms, theorising around them, and further navigating the platform by ‘gaming’, ‘hacking’, ‘outsmarting’, ‘beating’, and ‘cheating’ them. This narrative of ‘gaming the system’ legitimizes the platform’s authority within the realm of cultural production [30]. Post June 2021, users continue to perform the labour of visibility, now with a deeper understanding of the algorithms’ functional capabilities. While Instagram provides a remedy for a long-standing informational asymmetry, a newer form of control emerges through its assertion of its system of curation and ranking algorithms. Now that users have been reminded of the platform’s omnipresent gaze and made aware of how it functions, their quest for engagement need not slow down. In fact, it may become more conscious, tactical, and targeted in its pursuit of algorithmic appeal. It is this narrative of algorithmic appeal on which Instagram continues to base its business strategy, one that allows it to maintain its authority over user behaviour.

Algorithmic power does not sustain itself on secrecy. Instead, it is how algorithms shape the world that determines their power. Neither Instagram’s algorithms nor its encouragement of visibility labour has changed with the new announcements. The power of its algorithms remains intact, with only the method of control shifting from covertness to assertion. In this way, Instagram’s strategic, business-oriented foray into algorithmic transparency — hinged on the temptation of algorithmic appeal — does little to meaningfully alleviate the disciplining effects of its algorithmic power.

References

  1. eMarketer. 2020. Global Instagram Users 2020. Access: https://www.emarketer.com/content/global-instagram-users-2020
  2. Bucher, Taina & Helmond, Anne. (2018). The Affordances of Social Media Platforms.
  3. The New York Times. 2016. Instagram May Change Your Feed, Personalizing It With an Algorithm. Access: https://www.nytimes.com/2016/03/16/technology/instagram-feed.html
  4. Instagram. 2020. Shedding More Light on How Instagram Works. Access: https://about.instagram.com/blog/announcements/shedding-more-light-on-how-instagram-works
  5. Mahnke Skrubbeltrang, M., Grunnet, J., & Tarp, N. T. (2017). #RIPINSTAGRAM: Examining user’s counter-narratives opposing the introduction of algorithmic personalization on Instagram. First Monday, 22(4). https://doi.org/10.5210/fm.v22i4.7574
  6. Coming Up Roses. 2017. 10 Hard Truths of Instagram (Instagram Algorithm Talk). Access: https://cominguprosestheblog.com/instagram-algorithm-talk/
  7. Devdiscourse. 2019. Instagram algorithm change: Users left scratching heads in confusion. Access: https://www.devdiscourse.com/article/technology/674262-instagram-algorithm-change-users-left-scratching-heads-in-confusion
  8. Wordstream. 2020. The Secrets of the Instagram Algorithm — Revealed! Access: https://www.wordstream.com/blog/ws/2018/06/13/instagram-algorithm
  9. Intelligencer. 2018. “Is Instagram Strategically Withholding My Likes?” Access: https://nymag.com/intelligencer/2018/01/does-instagram-withhold-likes-to-get-users-to-open-app.html
  10. O’Meara V. Weapons of the Chic: Instagram Influencer Engagement Pods as Practices of Resistance to Instagram Platform Labor. Social Media + Society. October 2019. doi:10.1177/2056305119879671
  11. Ha, A. 2015. An Experiment: Instagram Marketing Techniques and Their Effectiveness. California Polytechnic. Access: https://digitalcommons.calpoly.edu/comssp/185/
  12. Followchain. 2020. Instagram Follow Train: Rules, Groups, Guidelines. Access: https://www.followchain.org/instagram-follow-train/
  13. DIY Photography. 2017. Photos of Faces Perform Almost 40% Better on Instagram, Study Shows. Access: https://www.diyphotography.net/photos-faces-perform-almost-40-better-instagram-study-shows/
  14. Boosted. Date unknown. 5 Instagram Myths Small Businesses Need to Stop Believing. Access: https://boosted.lightricks.com/5-instagram-myths-small-businesses-need-to-stop-believing/
  16. Buzzfeed News. 2019. Relax, Instagram Is Not Limiting Your Favorite Account’s Reach To 7% Of Its Followers. Access: https://www.buzzfeednews.com/article/blakemontgomery/instagram-not-limiting-posts
  17. Facebook AI. 2019. Powered by AI: Instagram’s Explore recommender system. Access: https://ai.facebook.com/blog/powered-by-ai-instagrams-explore-recommender-system/
  18. Mosseri, A. Instagram. 2021. Shedding More Light on How Instagram Works. Access: https://about.instagram.com/blog/announcements/shedding-more-light-on-how-instagram-works
  19. Scolere, L., Pruchniewska, U., & Duffy, B. E. (2018). Constructing the Platform-Specific Self-Brand: The Labor of Social Media Promotion. Social Media + Society. doi:10.1177/2056305118784768
  20. Cotter, K. (2018). Playing the visibility game: How digital influencers and algorithms negotiate influence on Instagram. New Media & Society. doi:10.1177/1461444818815684
  21. Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. doi:10.1177/1461444812440159
  22. Heldt, A. (2019). Borderline speech: caught in a free speech limbo? Leibniz Institute for Media Research, Hans-Bredow-Institut, Hamburg, Germany.
  23. Middlebrook, C. (2020). The Grey Area: Instagram, Shadowbanning, and the Erasure of Marginalized Communities. Available at SSRN: https://ssrn.com/abstract=3539721 or http://dx.doi.org/10.2139/ssrn.3539721
  24. Foucault, M. (1977). Discipline and Punish: The Birth of the Prison. London: Allen Lane.
  25. Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. doi:10.1177/1461444812440159
  26. Cirucci, A. (2018). A New Women’s Work: Digital Interactions, Gender, and Social Network Sites. International Journal of Communication, 12, 23. Retrieved from https://ijoc.org/index.php/ijoc/article/view/8348/2409
  27. Bucher, T. (2016). The algorithmic imaginary: exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44. doi:10.1080/1369118x.2016.1154
  28. Eslami, M., Karahalios, K., Sandvig, C., Vaccaro, K., Rickman, A., Hamilton, K., & Kirlik, A. (2016). First I “like” it, then I hide it: Folk Theories of Social Feeds. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. ACM, New York, NY, USA, 2371–2382. doi:10.1145/2858036.2858494
  29. Abidin, C. (2016). Visibility labour: Engaging with Influencers’ fashion brands and #OOTD advertorial campaigns on Instagram. Media International Australia, 161(1), 86–100. doi:10.1177/1329878X16665177
  30. Petre, C., Duffy, B. E., & Hund, E. (2019). “Gaming the System”: Platform Paternalism and the Politics of Algorithmic Visibility. Social Media + Society, 5(4). doi:10.1177/2056305119879995

Sukhnidh Kaur

Thoughts on the evolving internet, society, and gender, with a sprinkle of pop culture and introspection // research fellow at Microsoft