- School of Arts, Media and Creative Technology, University of Salford, Salford, United Kingdom
This article contributes to knowledge on EU policy for Internet intermediaries by providing a characterization and analysis of the system of governance for intermediaries set out initially in the 2000 Directive on E-Commerce (DEC) and recently updated in the 2022 Digital Services Act (DSA). The article shows how the new regulatory system of the DSA, unlike its predecessor, is underpinned by a strong European public transnational network governance approach, with a very noteworthy instantiation of regulatory responsibility at the EU level in respect of the power given to the European Commission to regulate Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). This reflects an attempt to mitigate the negative consequences of the largely light-touch, self-regulated environment previously faced by Internet intermediaries. The article contends that the EU’s new system of platform regulation instead creates a trans-European network, dominated by public regulatory actors and enabled by epistemic regulatory actors, more akin to the neoliberal model of EU telecommunications governance than to the private interest self-regulatory aspirations of the Internet governance specialists of the early 2000s, when the DEC was established.
Introduction
The EU has recently set out a new regulatory policy framework for online digital commerce services at the core of which sits the Digital Services Act (DSA) (European Parliament and Council, 2022), fully operational from February 2024. The DSA updates the landmark 2000 Directive on Electronic Commerce (DEC) (European Parliament and Council, 2000). A key feature of this legislation is its treatment of so-called Internet intermediaries, more commonly known as ‘platforms’. Despite its relative infancy, the DSA has elicited significant academic attention related to, variously, the due diligence obligations arising from the Act (Asensio, 2023); its broad legal and policy context (Buri and van Hoboken, 2021); its implications for online consumers (Reifa, 2022), online harms regulation (Heldt, 2022) and fundamental rights (Frosio and Geiger, 2023); the political discourse of the DSA (Schlag, 2023); its global regulatory implications (Nunziato, 2023; Tourkochoriti, 2023) and the DSA’s potential contribution to the EU’s digital single market (Sagar and Hoffmann, 2021; Hohmann and Kelemen, 2023).
To date, however, there has been no work that focuses in detail on the regulatory governance forms specified in the DSA and their significance for our understanding of the EU’s approach to the regulation of online communication, a gap in knowledge that this article seeks to close with its specific focus on the regulation of Internet intermediaries. The article undertakes a comparative exploration of the regulation of Internet intermediaries set out in both the DEC and its successor, the DSA, asking: what are the differences and similarities between the EU’s initial and current approaches to the governance of Internet intermediaries? What does this tell us about current understandings of the value of self- and co-regulatory approaches to online regulation in the EU? In so doing, the article provides evidence of what it argues is a highly significant change in approach to the EU’s treatment of Internet intermediaries: a shift in regulatory tone and substance from protection to responsibility-seeking. This reflects two rather different developments in online communication since the DEC’s passage. On the one hand, some very large intermediaries, in particular social network platforms, have become dominant and commercially successful. On the other, there is growing concern about the potentially aberrant behavior of users of intermediary services, and about the welfare of platform users who come into contact with content hosted by intermediary service providers.
Conceptually, the article draws together literature from the adjacent fields of Internet governance and European telecommunications regulation to explain and account for the new system of governance entailed in the DSA. It shows how the original DEC was very much reflective of self-regulatory perspectives on Internet commerce regulation that pertained at the beginning of the century, in the relatively early years of Internet platforms. In the case of the DEC, the article argues that a core aim of EU digital policy at that point was to create a flexible, light-touch regulatory environment in order to protect Internet intermediaries from liability and encourage their growth and that of electronic commerce more broadly. By contrast, the regulatory approach of the DSA, the article argues, is considerably more responsibility-seeking in its focus. To achieve the latter, it moves away from the self-regulation culture and approach of Internet governance by specifying public regulatory measures more akin to regulatory governance strategies developed by the EU for the longer-established telecommunications sector. Specifically, the article shows how the new regulatory system of the DSA is underpinned by a strong European public transnational network governance approach, with a very noteworthy instantiation of regulatory responsibility at the EU level in respect of the power given to the European Commission to regulate Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). This reflects an attempt to mitigate the negative consequences of the largely light-touch, self-regulated (and even unregulated) environment faced by Internet intermediaries. The article contends that the EU’s new system of platform regulation instead creates a public regulatory dominated form of pluri-stakeholderism more akin to the neoliberal model of EU telecommunications governance than the private interest self-regulatory aspirations of Internet governance specialists of the early 2000s, who would have eschewed the telecommunications regulatory model as outmoded and state-dominated. The article concludes by reflecting on the significance of this new approach for the utility of self- and co-regulation in Internet platform environments.
The internet, self-regulation and the growth of internet intermediaries
Work on the governance of the Internet emerged in earnest shortly after its popularization around the mid-to-late 1990s. As has been well cataloged, an initially relatively small-scale networked environment developed by the technical pioneers of the Internet – with strong communitarian and fiercely independent characteristics – operated as a system self-governed by its users in the now almost unimaginable days when state and governmental actors and commercial players remained largely outside Internet-based communication (Mathiason, 2008). Despite the Internet’s undoubted self-regulatory origins, as its usership grew internationally at an exponential rate, its burgeoning strategic character meant that the state had, controversially, begun to become a significant actor in its evolving governance landscape (Drezner, 2004). Before long, scholars were unearthing evidence of a range of governance forms, some with a co-regulatory public-private character, for example in relation to Internet country code Top Level Domains (Christou and Simpson, 2009). The growing complexity of Internet governance (Dutton, 2013) led to attempts to unpack and understand better its actors and their processual interactions (Kleinwachter, 2006; Broeders, 2015), where DeNardis (2014) recognized a distinct lack of planned governance at the international level. The growth of Internet services of various kinds – notably in the provision of access, search and hosting facilities – led to a focus on the role that their providers played in the day-to-day broad governance of the Internet (Van Eeten and Mueller, 2012).
Within this, the role of Internet intermediary service providers has come to particular prominence, not least the clutch of communication platforms that have now come to dominate in the fields of search (Google), shopping (Amazon), photo-sharing (Instagram), microblogging (X), closed user group messaging (WhatsApp) and combinations of these (Snapchat, Facebook). A vigorous debate has occurred on the extent to which these organizations can, do and should govern online communication. Relatedly, there has been consideration of the extent to which Internet intermediaries should be the subject of governance themselves, and are even governable at all. It is the case that as the Internet ‘has become an intrinsic part of the lives of the world’s population, these organizations are the conduits and sites of much of 21st century human life, since a growing proportion of the latter in its many forms is conducted through electronic means’ (Simpson, 2022). Hofmann et al. (2017) explore the challenge of recognizing the existence of – and thereafter taking decisions on – a set of processes that might be categorized as governance of the Internet. Interestingly, they put forward the idea of reflexive coordination, where everyday activities that recognizably govern users’ communicative behavior online generate problems and thus become in need of governance themselves. This idea can be expanded to include an analysis of the consequences of ‘inaction’ and ‘detachedness’ associated with some of the roles played by Internet intermediaries, particularly those offering mere conduit and hosting services. It is within this complex and controversial debate that the EU’s policy position on the regulation of Internet intermediaries has taken shape over more than 20 years, manifest in the Directive on E-Commerce and the Digital Services Act, in particular. The next section develops the conceptual context for the analysis of the DEC and DSA by exploring recent literature on public regulatory forms and processes. This is followed by an analysis of the approach taken by the EU to the regulation of Internet intermediaries in the DEC in order to illustrate the predominance of a self-regulatory approach. Thereafter, the article explores the governance forms specified in the DSA to illustrate a highly significant change of approach akin to predominantly trans-European network governance in which public regulatory political and epistemic actors are prominently positioned. The article moves to a conclusion by exploring some early evidence of how the new system specified in the DSA is being operationalized by the EU before reflecting on the significance of the evolution of the EU’s policy on Internet intermediaries entailed in the journey from the DEC to the DSA.
Regulatory forms and communications systems
Academic work in political science has for some years focused on a consideration of the emergence of newer, more flexible forms of regulatory governance that are potentially useful in illuminating the character of the EU’s regulatory approach to Internet intermediaries set out in the DEC and DSA. Bevir and Phillips (2017, p. 686) urge a re-think of governance ‘not as a particular state formation but as a set of meaningful practices’ informed by various beliefs, concepts and preferences. They draw on Borzel’s (1998, p. 254) characterization of governance as ‘a set of relatively stable relationships which are of a non-hierarchical and interdependent nature linking a variety of actors, who share a common interest with regard to a policy and who exchange resources to pursue these shared interests acknowledging that cooperation is the best way to achieve common goals’.
Recent work by Kruck and Weiss (2023, p. 1209) explored extending the relevance of the well-established idea of the regulatory state (Majone, 1994, 1997) ‘as a distinct combination of rules based policy instruments and expertise-based foundations of authority’ combining ‘epistemic authority and proliferating rules’ (emphasis in original). Koop and Lodge (2020, p. 1612) argue that the regulatory state has developed to entail ‘a broadening of decision-making and conceptions of regulation, a greater role for communication and outward-oriented activities, and a widening of stakeholder engagement and accountability’ where ‘regulators have moved away from practices associated with ‘responsibility’ toward practices aimed at ‘responsiveness’ to public and political concerns’ (p. 1613). The literature on experimentalist governance – whose premise is that traditional hierarchical governance has become strained in the light of uncertainty and increased variety – notes its flexibility, as well as the responsibility given to actors closely involved in the policy sector in question, and emphasizes information gathering as a way of revising governance practices from experience (Sabel and Zeitlin, 2012). Monti and Rangoni (2022) have found that in digital markets, when there is evidence of uncertainty, actors will engage in more experimentalist governance forms. Sabel and Zeitlin (2008) claim that telecommunications is a sector which has displayed experimentalist governance. Bellanova and De Goede (2022) find a dearth of literature on ‘modes of practical collaboration between private moderation and public authority’ (p. 1320) and stress the importance of legal and technological mechanisms of co-production in respect of content moderation in the ‘digital shaping of European public space’ (p. 1130).
The international, multi-actor character of the European communication sector has led to a focus on the idea and practices of networked governance. Steingass (2020) argues that political-administrative actors have engaged beyond inter-state power brokering and across levels of governance to shape EU policy.
Westlund (2017, p. 62) notes how EU agencies and networks are structured to give control to the national level yet ‘studies indicate that intra-network behavioral patterns are not solely intergovernmental and decentralized, but are supplemented or replaced by more integrated patterns that cross national borders and levels of government’ where relationships of cooperation and interaction move outside of the control of government. By contrast, Heims (2017, p. 1117) explores the role of national regulators in EU bodies concluding that ‘national authorities…remain the bodies that hold the greatest regulatory capacities and expertise’. Nesti (2018) argues that network governance can lack visibility, be remote from its principals and maintain distance from democratic institutions which can give significant power and flexibility to the network. Arras and Braun (2017, p. 1259) focus on stakeholder involvement in the regulatory process noting that ‘rather than being independent and insulated from external pressures, as the idea of delegation suggests, EU agencies are strongly embedded in a network of stakeholders’ where the need for expertise tends to risk dependence on the regulated industry. Rimkute and Mazepus (2023) focus on the authority-legitimacy gap in EU agencies and consider the conditions under which EU level epistemic authority can work effectively.
The importance of the European Commission as a regulatory actor in the electronic communications sector has been well established (Humphreys and Simpson, 2005). Recent work by Oztas and Kreppel (2022) has re-focused on the Commission’s agenda-setting power and policy influence, arguing that through ‘informal networks, epistemic communities, and formal institutional decision-making rules, a myriad of other actors can shape EU legislation before and after it is formally initiated by the Commission’ (p. 409). They conclude that ‘Instead of consolidating itself as the political ‘engine of Europe’, the Commission appears to have become increasingly reliant on policy congruence with other core EU institutions’ (emphasis in original), yet they also find that ‘autonomous Commission agenda influence is far from disappearing altogether’ (p. 422).
Krej Laurens (2022) focuses on why EU legislators prefer the creation of a network of national regulatory authorities, with specific concern about the conditions under which new networks are created for policy enforcement ‘in contexts that are already institutionalized’ (p. 1569), finding that networks can be created to solve resourcing challenges that increased centralization would entail. Yesilkagit and Jordana (2022) focus on the idea of entangled agency, where national regulatory authorities (NRAs) have evolved to be able to participate simultaneously at national and European governance levels whilst maintaining their national angle. They note that NRAs’ ‘habituation’ has moved toward the European level and conclude that there is evidence of a ‘European transnational policy arena characterized by the occupation of multiple decision-making and advisory positions within key administrative bodies in the EU’ (p. 1691).
Vantaggiato (2022) considers how European administrative networks may evolve through time with a focus on social capital within the networks deployed to deal with challenges of joint action, where they ‘comprise one type of actor (national regulators), from various jurisdictions, whose goal is producing commonly agreed rules and promoting their harmonization’ (p. 1632). The purpose of these networks is learning and influencing. Regulators balance interests and concerns with those of other actors in the environment that change over time and networks can evolve ‘to a single close-knit community of peers that…focuses primarily on achieving compromise in order to influence policy-making’ (p. 1647). Vantaggiato et al. (2021) cite evidence from the governance of European telecommunications where a European administrative network contains competence shared between national and EU level, where cooperation is voluntary and informal. They contend that even in informally constituted networks, their internal structure is equally likely (compared to highly formally structured networks) ‘to shape members’ perceptions of and engagement in the network’ (p. 587). Humphreys and Simpson (2008, pp. 866–867) in exploring the development of the European telecommunications regulatory framework found evidence of a ‘two-level, pluri-dimensional governance order…where a network of mostly technocratically focused actors has assumed responsibility for governance…dominated by quasi-state actors in the shape of national level NRAs and the European Commission’.
EU regulation of internet intermediaries and the DEC – self-regulation and protection from liability
In the late 1990s, the EU’s approach to the regulation of electronic commerce reflected much of the self-regulatory perspectives on Internet governance that were prevalent at the time. The DEC trained its focus on so-called information society services, defined as the selling of goods online; the provision of online information and commercial communications; the provision of tools to allow search, access and retrieval of data; the transmission of data across a network; the provision of network access; and the hosting of information. The DEC gave specific treatment to the governance of Internet intermediaries, where it argued that ‘disparities in Member States’ legislation and case law concerning liability of service providers acting as intermediaries prevent the smooth running of the internal market’. Regarding the tackling of illegal information online, the directive argued that ‘such mechanisms could be developed on the basis of voluntary agreements between all parties concerned’ (para 40).
At the time, the EU was very conscious of the need not to lose pace with developments in the Internet economy, having not been at the forefront of early aspects of the emerging governance arrangements for the Internet, related to its system of naming and addressing, for example. There was a particular concern not to put in place any legislative impediments to the growth of online commercial activity (Christou and Simpson, 2007), with two main consequences. First, the DEC adopted a protective approach to the role of Internet intermediaries, with a particular focus on limiting their commercial liability. Second, and related to this, intermediaries were largely left to decide the extent to which, if at all, they developed a consideration for – and took action in relation to – content for which they acted as conduits and hosts. The directive noted that its provisions related to exemptions from liability in respect of ‘the technical process of operating and giving access to a communication network over which information made available by third parties is transmitted or temporarily stored, for the sole purpose of making the transmission more efficient; this activity is of a mere technical, automatic and passive nature, which implies that the information service provider has neither knowledge of, nor control over, the information which is transmitted or stored’ (European Parliament and Council (2000): para 42). It was nevertheless noted that the liability limitation did not preclude the possibility of legal action to address a particular problem, which could include the removal or disabling of access to illegal information. The directive also made it clear that once a service provider became aware of illegal activity, it was required to act to remove or disable access to the information, albeit ‘in the observance of the principle of freedom of expression’ (DEC 2000, para. 46). The DEC stipulated that the prohibition on monitoring obligations pertained only to those of what were described as a ‘general nature’, not those of a ‘specific case’, thus allowing national discretion over particular matters to take effect. In a further indication of the self-regulatory character of the legislation, the DEC noted that the European Commission and EU Member States should ‘encourage the drawing up of codes of conduct’ which would ‘not impair the voluntary nature of such codes’ (European Parliament and Council (2000): para 49).
Articles 12–15 of the DEC set out the specific provisions related to the liability of Internet intermediaries. Article 12 refers to the role of being a ‘mere conduit’ and states that service providers are not liable for information transmitted across a network as long as they have not initiated the transmission; did not select the receiver of the transmission; and did not select or modify the information being transmitted. Article 13 of the directive refers to ‘caching’, where a service provider is not ‘liable for the automatic, intermediate and temporary storage of that information’ as long as the service provider does not modify the information; complies with access conditions to the information; complies with rules in respect of updating the information; does not impede the lawful use of technology to obtain data on the use of the information; and removes or disables access to the information quickly in the light of knowledge that the initial source of the information has been removed from the network or access to it disabled, or that a court or administrative authority has decreed that such removal or disablement must occur.
Article 14 of the DEC refers to ‘hosting’, where a service provider was deemed to be not liable for information stored at the request of the recipient of the service as long as it does not have knowledge of illegal activity or information and ‘is not aware of the facts or circumstances from which the illegal activity or information is apparent’ and, after it has become aware of these circumstances, acts quickly to disable or remove access to the information in question. Article 15 of the directive refers to ‘no general obligation to monitor’, where Member States are instructed not to ‘impose a general obligation on providers…to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity’ (European Parliament and Council (2000), article 15, para 1). However, the article does allow Member States to establish obligations on service providers related to informing designated national authorities of such activity, including information which could enable the identification of those receiving their services.
Regarding the specific modalities of governance, the DEC also made an important stipulation regarding cooperation between Member States in the implementation of the directive, where they were asked to ‘appoint one or several contact points, whose details they shall communicate to the other Member States and to the Commission’. Here, states were asked to provide, ‘as quickly as possible’, information that might be requested by another Member State or the Commission. The looseness of the cooperation arrangement specified in the DEC is indicated by the stipulation in article 19(5) that States ‘shall encourage the communication to the Commission of any significant administrative or judicial decisions taken in their territory regarding disputes relating to information society services and practices, usages and customs relating to electronic commerce’ (author’s emphasis) and that the Commission should communicate this information to fellow Member States. Regarding sanctions to be applied for infringements of the DEC’s stipulations, the flexibility of the directive was clear from Article 20, where it was asserted that ‘Member States shall determine the sanctions applicable to infringements of national provisions adopted pursuant to this Directive and shall take all necessary measures to ensure that they are enforced’. In line with the thinking of the time, the Directive asked Member States to ensure that they did not take any measures that would discourage the use of out-of-court settlements for disputes arising from engagement between service providers and their customers, and should merely ‘encourage bodies responsible for out-of-court dispute settlement to inform the Commission of the significant decisions they take regarding information society services and to transmit any other information on the practices, usages or customs relating to electronic commerce’ (author’s emphasis) (European Parliament and Council (2000): article 17), indicating the comparatively exploratory and embryonic state of e-commerce at this time.
The new EU governance framework for internet intermediaries: from self-regulation to networked co-regulation and responsibility-seeking
Whilst the DEC provided small though significant coverage of the role and position of Internet intermediaries, some 20 years later these organizations were, rather differently, nothing short of center-stage in the DSA, whose declared aim was ‘to contribute to the proper functioning of the internal market for intermediary services’ (article 1), comprising so-called mere conduit services; caching services, where there is ‘automatic, intermediate and temporary storage’ of information ‘performed for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request’; and hosting services, comprising ‘storage of information provided by, and at the request of, a recipient of the service’ (DSA, article 3, para g).
The Act commences with specifications on the exemptions from liability afforded to intermediaries, which are largely in line with those of the DEC. Here, providers of mere conduit services are not liable for the information transmitted or accessed as long as they did not initiate the transmission, did not select its receiver, and did not select or modify the information contained in the transmission (article 4). For caching services, there is no liability as long as the service provider does not ‘modify the information’, complies with ‘conditions on access to the information’, complies with rules related to updating the information ‘specified in a manner widely recognized and used by industry’, ‘does not interfere with the lawful use of technology, widely recognized and used by industry, to obtain data on the use of the information’, and acts quickly to remove or disable access to information stored on becoming aware that the information at the source of the initial transmission has been removed from the network in question or had access to it disabled, or where there is an order by an administrative or judicial body for this to occur (article 5). For hosting services, there is no liability as long as the provider ‘does not have actual knowledge of illegal activity or illegal content and…is not aware of facts or circumstances from which the illegal activity or illegal content is apparent’ and, after obtaining relevant knowledge, acts quickly to remove or disable access to the illegal content in question (article 6). The DSA also noted that organizations undertaking ‘own initiative investigations’ would not thereby become ineligible for the exemptions from liability specified in the Act. Service providers are required to make publicly available once per year a report on the content moderation they have undertaken in the period in question.
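The cumulative character of the exemption conditions just described can be expressed as simple boolean logic. The sketch below (Python, purely illustrative; the field names are invented for this example and the real legal tests turn on interpretation rather than flags) encodes the three negative conditions of the mere conduit exemption in Article 4.

```python
from dataclasses import dataclass

# Toy illustration of the cumulative 'mere conduit' conditions (DSA, article 4)
# described above. Field names are invented for this sketch; in practice each
# condition is a matter of legal assessment, not a boolean flag.

@dataclass
class ConduitActivity:
    initiated_transmission: bool
    selected_receiver: bool
    selected_or_modified_information: bool

def mere_conduit_exemption_applies(activity: ConduitActivity) -> bool:
    """All three negative conditions must hold for the liability exemption."""
    return not (activity.initiated_transmission
                or activity.selected_receiver
                or activity.selected_or_modified_information)

# Example: a provider that merely passes traffic through its network.
print(mere_conduit_exemption_applies(
    ConduitActivity(initiated_transmission=False,
                    selected_receiver=False,
                    selected_or_modified_information=False)))  # True
```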
In a significant development setting the conditions for the deployment of co-regulated EU-wide networked governance, intermediary service providers are required to nominate a single point of contact for communication with those authorities responsible for the regulation of the DSA’s provisions at national and EU level. They are also required to put in place a mechanism to allow users to notify them of any potentially illegal content they may be hosting and to take decisions on this quickly. Hosting providers are further required to give a justification to service recipients affected by decisions to restrict information provided by those recipients, on the grounds that the content in question is illegal or incompatible with the provider’s conditions of service.
Section 3 of the Act also stipulates additional provisions for online platform providers. The latter are required to provide recipients of services with access to a complaints handling system to be used in cases where the platform takes action against information provided by the service recipients on grounds of illegality or non-compliance with the platform’s terms and conditions of service (European Parliament and Council (2022): article 20). In line with the self-regulatory approach of its DEC forebear, service recipients have the right to use an out-of-court dispute settlement process to resolve disputes related to matters that remain unresolved by the internal complaints handling system. These bodies are to be certified for a five-year period by a series of Digital Services Coordinators (DSCs) established in each Member State, which are required to produce a report biennially on the performance of the out-of-court dispute settlement bodies. If the out-of-court dispute settlement body decides in favor of the platform, the DSA states that the service recipient will not have to reimburse fees and expenses of the platform in relation to the dispute unless the service recipient ‘manifestly acted in bad faith’ (European Parliament and Council (2022), article 21, paragraph 5).
This co-regulatory system of governance is further elaborated by a series of so-called ‘trusted flaggers’, organizations designated by the Digital Services Coordinators of Member States. These ‘trusted flaggers’ are required to report annually on the notices that they submit during the period in question. Article 23 of the DSA requires platforms to suspend the provision of their services to recipients ‘that frequently provide manifestly illegal content’ (European Parliament and Council (2022): Article 23, para 1). Platforms are also required to submit to the Commission their decisions (with justifications) related to actions taken against service recipients for ‘inclusion in a publicly accessible machine-readable database’, indicating the future vital importance of information resources in the governance system for Internet intermediaries that will unfold across the EU.
The DSA also makes stipulations regarding the responsibilities of platforms to service recipients in relation to advertising (article 26) and recommender systems (article 27). It places responsibility on platforms that allow ‘consumers to conclude distance contracts with traders’ to obtain information on the identity and commercial legitimacy of the traders in question, to assess, to the best of their ability, the reliability and completeness of this information, and to suspend traders that do not provide the required information. Platforms are also required to inform affected purchasers of an illegal product or service of the illegality of the product or service, the trader’s identity and any means of redress (DSA 2022: Article 32, para. 1).
Beyond these general stipulations, a key part of the DSA is a set of obligations for so-called very large online platforms (VLOPs) and very large online search engines (VLOSEs) that ‘have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million’ (DSA 2022: Article 33, para 1). Importantly, the designation of VLOPs and VLOSEs is the responsibility of the European Commission, which has been given a central supranational level role in their future governance through the DSA. The Act requires these organizations to undertake annual risk assessments ‘stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services’ (DSA 2022: article 34, para. 1) and to put in place ‘reasonable, proportionate and effective mitigation measures’ related to the identified risks, paying particular attention to ‘fundamental rights’ (DSA 2022: article 35, para. 1). Here, the Commission and DSCs have the option to issue guidelines in respect of particular risks. The DSA stipulates a so-called ‘crisis response mechanism’ whereby the Commission, on the recommendation of the newly established European Board for Digital Services (see below), can require VLOPs and VLOSEs to take particular rectifying measures. The platforms are required to pay for an annual independent audit of their compliance in respect of key measures of the DSA (DSA 2022: article 37) and to respond to concerns arising from the audit through the creation of an audit implementation report cataloguing actions taken, or justifying inaction, in respect of the audit’s recommendations. If the platforms use recommender systems, they are required to provide at least one option for each recommender system ‘not based on profiling’ (article 38).
VLOPs and VLOSEs are required, on request from DSCs, to provide access to data ‘for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union’ (DSA 2022: article 40, para 4). These platforms are also required to create a compliance function ‘independent from their operational functions and composed of one or more compliance officers’ (DSA 2022: article 41, para. 1). The Act contains provisions for transparency whereby the VLOPs and VLOSEs need to provide to their DSC of establishment and the Commission, and to make public, a series of reports regarding their auditing and the actions that arose from it. The Act also specifies that these large platforms be charged an annual supervisory fee to cover the Commission’s costs arising from its supervisory tasks, including the costs of setting up and maintaining the database and the Act’s information sharing system.
In a more self-regulatory mode reminiscent of the DEC, the DSA noted that the Commission and the Board should encourage the creation of voluntary codes of conduct related to its application. The Act sets out a role for the Commission, in situations of systemic risk, to bring together platforms, other commercial interests and civil society bodies to set up codes of conduct and related reporting measures to address the specified risks. A significant part of the Act supports voluntary standards, set by European and other international organizations, in respect of a range of matters, including the communication of notices and templates related to the DSA, as well as advertising and the protection of minors online.
The implementation and enforcement processes of the DSA signal a significant development of approach from that in the DEC. Here, the Digital Services Coordinator designated by each Member State is responsible for monitoring and enforcement of matters related to the Act in the Member State in question as well as, importantly, ‘contributing to the effective and consistent supervision and enforcement of this Regulation throughout the Union’ (DSA, 2022: article 49, para. 2). The DSA stipulates that the DSCs – which were to be created by 17 February 2024 and to act ‘with complete independence’ – ‘shall cooperate with each other, other national competent authorities, the Board and the Commission’ (DSA, 2022: articles 49 and 50), pointing to the development of an elaborate European transnational governance network in the making. The DSCs are given significant investigatory and enforcement responsibilities (related to the power to accept commitments made, order cessations, impose fines and periodic penalties and adopt interim measures). The cross-border nature of Digital Services Coordinators’ work is likely to be very important where, for example, they are responsible for assessing complaints against service providers (in the country of the complainant) and transmitting the complaint to the DSC of establishment of the service provider in question. DSCs must draw up and make public annual reports of their activities. The DSA makes specific reference to competences in enforcement of the legislation, where the ‘Member State in which the main establishment of the provider of intermediary services is located shall have exclusive powers to supervise and enforce’ the DSA, except for the powers specified in paragraphs 2, 3 and 4 of the same article (DSA, article 56: para. 1). The Commission has exclusive supervision and enforcement powers related to Section 5 of the DSA, in respect of VLOPs and VLOSEs.
Overall, it was noted that ‘Member States and the Commission shall supervise and enforce the provisions of this regulation in close cooperation’ (DSA 2022: article 56, para. 5), including the exchange of information among DSCs and between DSCs and the Commission; requests by DSCs to each other for the investigation of specific service providers; and the undertaking of joint investigations by DSCs. In the case where a service provider is not established in the EU, the Commission holds the power to enforce relevant aspects of the DSA, though it is not clear how this might work in practice. In cases of inaction following requests for investigation, or in the case of disagreement on the part of the Board, the Commission may be called on to assess the matter. The Commission shall then communicate its decision to the DSC of establishment of the service provider in question, which will then undertake an investigation taking ‘utmost account’ of the views of the Commission within two months of the request for the review.
Section 3 of the DSA refers to the European Board for Digital Services (the Board). This new supranational level body shall ‘advise the Digital Services Coordinators and the Commission’ to contribute to the consistent application of the Regulation; coordinate and contribute to guidelines and analysis of the Commission and DSCs; and assist the DSCs and the Commission in supervising very large online platforms. The Board comprises the DSCs and is chaired by the Commission in a non-voting capacity, with the Commission also providing administrative and technical support to the Board. This is a well-established EU governance formula in telecommunications. Indicating how the governance of VLOPs and VLOSEs will incorporate epistemic expertise, the Board is able to invite experts and observers, can cooperate with other EU bodies and shall ‘make the results of this cooperation publicly available’ (DSA 2022: article 62, para. 5). The Board is assigned key tasks around supporting the coordination of joint investigations; supporting the analysis of reports; the issuance of opinions, advice and recommendations to DSCs; the provision of advice to the Commission related to Article 66 of the Act (referring to the launching of legal proceedings by the Commission); and the support and promotion of the development and implementation of European standards, guidelines, reports and codes of conduct in relation to the legislation.
A very important part of the DSA sets out the Commission’s powers and responsibilities in respect of VLOPs and VLOSEs. Here, in coordination with DSCs, it is required to develop EU expertise and to ‘coordinate the assessment of systemic and emerging issues across the Union’ in relation to VLOPs and VLOSEs (DSA 2022: article 64, para. 2). Importantly, the Commission can use its investigatory powers before initiating proceedings against a provider, on its own initiative as well as following a request. The Commission can request support from DSCs in investigating a possible infringement of the Act. It can undertake on-site inspections with help from professional experts – a further indication of the incorporation of a complex of private epistemic and public regulatory knowledge in the new system – and can ask for national legal assistance where it encounters opposition to its proposed inspection. The Commission’s investigatory powers are thus significant and it can adopt what are termed non-compliance decisions, where the VLOP/VLOSE is required to inform the Commission of measures taken to comply with any decision made by the Commission in respect of an infringement of the legislation. Further, the Commission has the power to impose fines on a VLOP/VLOSE of value ‘not exceeding 6% of its worldwide annual turnover in the preceding financial year’ (DSA 2022: article 74, para. 1) in respect of: infringement of provisions of the DSA; failure to comply with interim rectifying measures specified by the Commission; and failure to comply with a commitment made ‘binding by a decision pursuant to Article 71’. A fine not exceeding 1% of total annual income or worldwide turnover in the preceding financial year can be imposed for: supplying incorrect information; failure to reply to a request for information; failure to rectify misleading information; refusal to submit to an inspection; failure to comply with measures adopted by the Commission pursuant to Article 72; and failure to comply with conditions for access to the Commission’s file pursuant to Article 79(4) of the DSA. As will be seen in the next section of the article, the Commission has lost little time in pressing these powers into action in its regulation of VLOPs and VLOSEs.
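To give a sense of the scale of these ceilings, the following minimal sketch computes the maximum one-off fines that the percentages quoted above would permit. It is purely illustrative: the turnover figure is hypothetical, and the actual amount of any fine is for the Commission to determine within these caps.

```python
# Minimal illustrative sketch of the fine ceilings cited above (DSA, article 74).
# The turnover figure below is hypothetical, not drawn from any platform's accounts.

def dsa_fine_ceilings(worldwide_annual_turnover_eur: float) -> dict[str, float]:
    """Upper bounds on one-off fines the Commission may impose on a VLOP/VLOSE."""
    return {
        # Up to 6% of worldwide annual turnover: infringements of the Act,
        # non-compliance with interim measures, or breach of binding commitments.
        "infringement_ceiling_eur": 0.06 * worldwide_annual_turnover_eur,
        # Up to 1%: incorrect information, failure to reply, refusal to submit
        # to an inspection, and the other procedural failures listed above.
        "procedural_failure_ceiling_eur": 0.01 * worldwide_annual_turnover_eur,
    }

if __name__ == "__main__":
    # Hypothetical platform with EUR 100 billion worldwide annual turnover.
    for name, value in dsa_fine_ceilings(100_000_000_000).items():
        print(f"{name}: EUR {value:,.0f}")
```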
The DSA in Article 75 sets out what is referred to as ‘enhanced supervision of remedies’ relating to VLOPs and VLOSEs. Here, the VLOP or VLOSE in question is required to create an action plan to terminate or remedy any infringement found to exist, including a commitment to have an independent audit undertaken. The plan in question has to be communicated to the Board, the Commission and relevant DSCs. The Board then provides its view on the action plan and the Commission shall then decide if the measures in the plan are sufficient; if so, it will subsequently monitor the implementation of the action plan, keeping the Board and DSCs informed of this. The DSA gives the Commission the option of imposing so-called periodic penalty payments on the VLOP or VLOSE of not more than 5% of its average daily income or worldwide turnover in the preceding financial year per day ‘calculated from the date appointed by the decision’ (DSA 2022: Article 76, para. 1) in order to ensure compliance with requests made in respect of matters such as supplying information, submitting to an inspection and compliance with legally binding commitments arising from the application of the DSA. The power afforded to the Commission in the legislation is clear from the statement in the DSA that ‘where a national court rules on a matter that is already the subject of a decision adopted by the Commission…that national court shall not take any decision which runs counter to that Commission decision’ (DSA 2022: article 82, para. 3). The Commission is required to create an information sharing system to allow communication between itself, DSCs and the Board. The remarkable extent of the delegation of power to the Commission is indicated, paradoxically, by its initial conferral for a period of only five years. In addition, this arrangement can be revoked by the European Parliament or the Council, and any delegated acts taken by the Commission can enter into force only as long as there is no objection from the European Parliament and the Council.
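Because the Article 76 ceiling is expressed per day of non-compliance, the exposure compounds quickly. The sketch below illustrates this accumulation under a simple 365-day averaging assumption and hypothetical figures; it tracks the wording quoted above rather than any official calculation method.

```python
# Illustrative accumulation of periodic penalty payments (DSA, article 76):
# up to 5% of average daily income or worldwide turnover, per day of delay.
# The figures and the 365-day averaging are assumptions made for this sketch.

def max_periodic_penalty_eur(worldwide_annual_turnover_eur: float,
                             days_of_non_compliance: int) -> float:
    """Upper bound on accumulated periodic penalties over a non-compliance period."""
    average_daily_turnover = worldwide_annual_turnover_eur / 365
    return 0.05 * average_daily_turnover * days_of_non_compliance

if __name__ == "__main__":
    # Hypothetical VLOP: EUR 100 billion annual turnover, 30 days of non-compliance.
    print(f"EUR {max_periodic_penalty_eur(100_000_000_000, 30):,.0f}")
```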
Explaining and accounting for the new EU framework for internet intermediaries
In an analysis of the DSA, Heldt (2022, p. 80) has argued that ‘one thing…is clear: the times of self-regulation are over – at least in the EU’. Instead, the evidence of this article points firmly toward the stipulation and likely development of a public-private trans-European network governance system. This was, in fact, operationalized very soon after the passage of the DSA. Referred to specifically by the Commission as an ‘enforcement network’ (European Commission, 2024a), this is a burgeoning governance system containing public political-administrative and subject-specific epistemic actors along the lines highlighted in recent scholarship on the regulatory state in Europe. An important example of this occurred in April 2023 with the creation by the Commission of the European Centre for Algorithmic Transparency (ECAT), with technical expertise aimed at working with the EU and its Member States in the implementation of the DSA. The ECAT soon signed an agreement with the French data science center, the Pôle d’Expertise de la Régulation Numérique (European Commission, 2024a), whose focus is on issues covered by the DSA, and more such agreements are likely to be put in place as the implementation of the DSA continues apace. There is also some evidence that the EU is attempting to promulgate the so-called ‘Brussels effect’ in the implementation of the DSA (Nunziato, 2023), with the signing of administrative agreements with the Australian eSafety Commissioner and the UK media regulator, Ofcom (European Commission, 2024b), a subject that goes beyond the scope of this article.
Since the DSA’s passage, the European Commission has moved swiftly to take action against VLOPs and VLOSEs, sending information requests to as many as 17 of them in January 2024 (AliExpress, Amazon Store, AppStore, Bing, Booking.com, Facebook, Google Search, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, YouTube and Zalando) to specify ‘the measures they have taken to comply with the obligation to give access…to the data that is publicly available on their online interface to eligible researchers’ (European Commission, 2024c).
It has also moved significantly beyond the task of information requests signaling strongly its intent to regulate platforms at the EU level into the future. At the beginning of 2024, it launched an investigation into TikTok in respect of potential DSA breaches related to ‘the protection of minors, advertising transparency, data access for researchers, as well as risk management of addictive design and harmful content’ (European Commission, 2024d). Less than 2 weeks before this, it was reported that Meta and TikTok had confirmed their intention to sue ‘the European Commission over an annual supervisory fee that companies listed under the DSA must pay’ (Tar, 2024a). It was reported that Meta was concerned that ‘companies that record a loss do not have to pay, even if they have a large user base or represent a greater regulatory burden, which means some companies paying nothing, leaving others to pay a disproportionate amount of the total’ (Tar, 2024b).
In December 2023, the Commission launched infringement proceedings against X regarding potential hate speech on its platform and the provision of data access to researchers. It also launched an investigation into AliExpress related to transparency in advertising and its handling of complaints (Rankin, 2024). In April 2024, the Commission launched an investigation into Meta under the DSA over potential insufficient action in respect of Russian disinformation and, in May 2024, it opened a second investigation, expressing concern ‘that systems of both Facebook and Instagram…may stimulate behavioral addictions in children, as well as create so-called ‘rabbit hole’ effects’. The proceedings are also focused on Meta’s age assurance and verification methods. In respect of the case, the EU Internal Market Commissioner, Thierry Breton, was quoted as asserting that the EU was ‘not convinced that it [Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram’ (Rankin, 2024). It is clear that concern over the nurturing of the digital economy that underpinned the treatment of Internet intermediaries in the DEC has been superseded, in the DSA, by concerns over regulating the content that they host and transmit. It has been argued that the DSA’s impact can be more significant than other similar pieces of legislation given the EU’s market size and the fact that it is now seen as ‘more influential as a regulatory power’ (Milmo, 2022).
The system of governance specified in the DSA can be regarded as a significant shift by the EU on the regulation of Internet intermediaries, more reminiscent of EU telecommunications governance than of the self-regulatory norms and practices of Internet governance. Much of this new system can be accounted for by recent literature on contemporary regulatory forms and practices. There is evidence of public regulatory authority within an extended understanding of the concept of the regulatory state. Here, a wide base of European stakeholders is likely to share responsibility for the governance of a vital part of the digital communication sector. The system now reflects the long-standing idea of regulatory responsibility-taking as much as regulatory responsiveness to public opinion, though the question of what to do about social network platforms has clearly entered the public consciousness. What is in development bears core hallmarks of experimentalist governance, where information gathering and responsibility-giving are central. However, as noted in the article, the idea of hierarchy has far from disappeared. The governance of Internet intermediaries is thus likely to display evidence of practical collaboration between what Bellanova and De Goede (2022) describe as private moderators and public authority, utilizing legal and technical methods of regulatory co-production. The system specified by the EU also clearly entails within it a strong networked governance character that moves away from the self-regulatory origins of the Internet. Here, best practice sharing, resource pooling and mutual performance monitoring (after Mastenbroek and Schrama, 2022) are likely to be key features of the system when fully operational. This is likely to create the entangled agency highlighted by Yesilkagit and Jordana (2022), with integrated regulatory patterns across national boundaries and a bi-level relationship between national and European actors. Here, the role of the European Commission will be vital, confirming Oztas and Kreppel’s (2022) recent assertion that the Commission’s agenda influencing is still prominent even if it relies on policy congruence with other EU actors, as evidenced in our case by the backstop authority held by EU Member States (through the Council) and the European Parliament. A key feature of the kind of networked governance in the new system is the tightness of its specification, which is far removed from the voluntarism and informality associated with self-regulatory approaches. This will require the development of shared competence and mutual reliance between national and EU level regulatory actors of a political-administrative and techno-epistemic variety. Busuioc and Lodge (2016, p. 248) note how regulators can enhance their accountability by building a good reputation with a range of stakeholders. Steingass (2020) argues that a range of different actors can advocate norms to shape policy practices. The agency of policy actors depends on participation in transnational policy communities and networks (Henriksen and Seabrooke, 2016), where the key is the ‘discursive construction of the context in which norms are advanced’ (p. 388).
Conclusion
Allen and Stockhem (2022) refer to the governance arrangements set out in the DSA as a layered enforcement regime and express concerns about how it will function, as well as about ‘the potential politicization of enforcement, enforcement overreach and regulatory independence’. They note the possibility of uneven resourcing of DSCs, and potential tension between the DSC of establishment and the European Commission. They are also critical of the decision not to establish a new EU agency for the enforcement of the DSA and, as a consequence, focus on the role of the European Commission. Here, oversight of the Commission, given its highly significant implementation powers, is seen as insufficient. They are also concerned that the Board does not have its own independent legal character, which might weaken its scope to take strong action. Overall, it is argued that the ‘DSA has put in place an enforcement regime that may not have taken the leap it truly needed’ (Allen and Stockhem, 2022).
At the time of writing, it is too early to assess the performance of the trans-European network that is in development for the governance of Internet intermediaries in the EU. However, the evidence of this article suggests that the network bears much more the characteristics of EU telecommunications governance than of the self-regulatory ethos and practices that underpinned the predecessor governance regime for intermediaries expressed in the DEC. Transnational regulatory networks in telecommunications have proven to be both resilient and influential, to the extent that Boeger and Corkin (2017, p. 988) have provided evidence that this network was able to play an ‘independent role’ in shaping its institutional evolution, displaying in the process ‘resilient and even self-reinforcing’ characteristics. The complexity of the regulatory challenges facing those charged with the task of implementing the DSA suggests that the development of robustness of this kind will be not only desirable but necessary if the EU’s revised policy on Internet intermediaries is to be considered a success.
Data availability statement
Publicly available datasets were analyzed in this study. This data can be found here: European Commission website (Directive on E-Commerce; Digital Services Act).
Author contributions
SS: Writing – original draft, Writing – review & editing.
Funding
The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.
Conflict of interest
The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Allen, A., and Stockhem, O. (2022). A series on the EU digital services act: ensuring effective enforcement. Center for Democracy & Technology. Available at: https://cdt.org/insights/a-series-on-the-eu-digital-services-act-ensuring-effective-enforcement/
Arras, S., and Braun, C. (2017). Stakeholders wanted! Why and how European Union agencies involve non-state stakeholders. J. Eur. Publ. Policy 25, 1257–1275. doi: 10.1080/13501763.2017.1307438
Asensio, P. d. M. (2023). “Due diligence obligations and liability of intermediary services: the proposal for the EU digital services act” in The legal challenges of the fourth industrial revolution, 29–46.
Bellanova, R., and De Goede, M. (2022). Co-producing security: platform content moderation and European security integration. J. Common Mark. Stud. 60, 1316–1334. doi: 10.1111/jcms.13306
Bevir, M., and Phillips, R. (2017). Genealogies of European governance. Comp. Eur. Polit. 15, 685–704. doi: 10.1057/s41295-016-0080-8
Boeger, N., and Corkin, J. (2017). Institutional path-dependencies in Europe’s networked modes of governance. J. Common Mark. Stud. 55, 974–992. doi: 10.1111/jcms.12546
Börzel, T. (1998). Organizing Babylon – on the different conceptions of policy networks. Public Adm. 76, 253–273. doi: 10.1111/1467-9299.00100
Broeders, D. (2015). The public core of the internet – an international agenda for internet governance. Amsterdam: Amsterdam University Press.
Buri, I., and van Hoboken, J. (2021). The digital services act (DSA) proposal: a critical overview. Digital Services Act Observatory, Institute for Information Law, University of Amsterdam, Discussion Paper, 28 October.
Busuioc, E. M., and Lodge, M. (2016). The reputational basis of public accountability. Governance 29, 247–263. doi: 10.1111/gove.12161
Christou, G., and Simpson, S. (2009). New governance, the internet and country code top level domains in Europe. Governance 22, 599–624. doi: 10.1111/j.1468-0491.2009.01455.x
Drezner, D. (2004). The global governance of the internet: bringing the state back in. Polit. Sci. Q. 119, 477–498. doi: 10.2307/20202392
Dutton, W. (2013). “Internet studies – the foundations of a transformative field” in The Oxford handbook of internet studies. ed. W. Dutton (Oxford: Oxford University Press), 1–26.
European Commission (2024a) Commission sends requests for information to 17 very large online platforms and search engines under the digital services act. Available at: https://digital-strategy.ec.europa.eu/en/news/commission-sends-requests-information-17-very-large-online-platforms-and-search-engines-under
European Commission (2024b) The enforcement framework under the digital services act. Available at: https://digital-strategy.ec.europa.eu/en/policies/dsa-enforcement
European Commission (2024c) Commission opens formal proceedings against TikTok under the digital services act. Available at: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_926
European Commission. (2024d). Commission opens formal proceedings against Meta under the digital services act related to the protection of minors on Facebook and Instagram. Available at: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_2664
European Parliament and Council. (2000). Directive 2000/31/EC of the European Parliament and of the council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the internal market (directive on electronic commerce)
European Parliament and Council. (2022). ‘Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a single market for digital services amending Directive 2000/31/EC (Digital Services Act)’, 27.10.22, Brussels, OJL277/1–102
Frosio, G., and Geiger, C. (2023). Taking fundamental rights seriously in the digital services Act’s platform liability regime. Eur. Law J. 29, 31–77. doi: 10.1111/eulj.12475
Christou, G., and Simpson, S. (2007). The new electronic economy – European governance strategies in a globalising economy. Cheltenham: Edward Elgar.
Heims, E. (2017). Regulatory co-ordination in the EU: a cross-sector comparison. J. Eur. Publ. Policy 24, 1116–1134. doi: 10.1080/13501763.2016.1206141
Heldt, A. (2022). “EU digital services act: the white hope of intermediary regulation” in Digital platform regulation. eds. T. Flew and F. R. Martin, Palgrave Global Media Policy and Business (Cham: Palgrave Macmillan).
Hofmann, J., Katzenbach, C., and Gollatz, K. (2017). Between coordination and regulation: finding the governance in internet governance. New Media Society 19, 1406–1423. doi: 10.1177/1461444816639975
Hohmann, B., and Kelemen, B. K. (2023). Is there anything new under the sun? A glance at the digital services act and the digital markets act from the perspective of digitalisation in the EU. CYELP 19, 224–248. doi: 10.3935/cyelp.19.2023.542
Humphreys, P., and Simpson, S. (2005). Globalisation, convergence and European telecommunications regulation. Cheltenham: Edward Elgar.
Humphreys, P., and Simpson, S. (2008). Globalization, the ‘competition’ state and the rise of the ‘regulatory’ state in European telecommunications. J. Common Mark. Stud. 46, 849–874. doi: 10.1111/j.1468-5965.2008.00802.x
Kleinwachter, W. (2006). Internet co-governance: towards a multilayer multiplayer mechanism of consultation, coordination and cooperation (M3C3). E–Learning 3, 473–487.
Koop, C., and Lodge, M. (2020). British economic regulators in an age of politicisation: from the responsible to the responsive regulatory state? J. Eur. Publ. Policy 27, 1612–1635. doi: 10.1080/13501763.2020.1817127
van Kreij, L. (2022). Enforcing EU policies: why do EU legislators prefer new networks of national authorities and not existing EU agencies? J. Eur. Publ. Policy 29, 1568–1589. doi: 10.1080/13501763.2022.2125045
Kruck, A., and Weiss, M. (2023). The regulatory security state in Europe. Journal of European Public Policy, 30, 1205–1229. doi: 10.1080/13501763.2023.2172061
Henriksen, L. F., and Seabrooke, L. (2016). Transnational organizing: issue professionals in environmental sustainability networks. Organization 23, 722–741.
Majone, G. (1994). The rise of the regulatory state in Europe. West Eur. Polit. 17, 77–101. doi: 10.1080/01402389408425031
Majone, G. (1997). From the positive to the regulatory state: causes and consequences of changes in the mode of governance. J. Publ. Policy 17, 139–167. doi: 10.1017/S0143814X00003524
Mathiason, J. (2008). Internet governance – the new frontier of global institutions. New York: Routledge.
Milmo, D. (2022). Digital services act: inside the EU’s ambitious bid to clean up social media. The Guardian. Available at: https://www.theguardian.com/media/2022/dec/17/digital-services-act-inside-the-eus-ambitious-bid-to-clean-up-social-media
Monti, G., and Rangoni, B. (2022). Competition policy in action: regulating tech markets with hierarchy and experimentalism. J. Common Mark. Stud. 60, 1106–1123. doi: 10.1111/jcms.13304
Nesti, G. (2018). Strengthening the accountability of independent regulatory agencies: from performance back to democracy. Comp. Eur. Polit. 16, 464–481. doi: 10.1057/cep.2016.24
Nunziato, D. C. (2023). The digital services act and the Brussels effect on platform content moderation. Chic. J. Int. Law 24, 115–128.
Oztas, B., and Kreppel, A. (2022). Power or luck? The limitations of the European Commission’s agenda setting power and autonomous policy influence. J. Common Mark. Stud. 60, 408–426. doi: 10.1111/jcms.13242
Rankin, J. (2024). EU investigates Facebook owner Meta over child safety and mental health concerns. The Guardian. Available at: https://www.theguardian.com/technology/article/2024/may/16/eu-investigates-facebook-owner-meta-over-child-safety-and-mental-health-concerns
Reifa, C. (2022). Protecting vulnerable consumers in the digital single market. Eur. Business Law Rev. 33, 606–634.
Rimkute, D., and Mazepus, H. (2023). A widening authority-legitimacy gap in EU regulatory governance? An experimental study of the European medicines Agency’s legitimacy in health security regulation. J. Eur. Publ. Policy 30, 1406–1430. doi: 10.1080/13501763.2023.2171091
Sabel, C. F., and Zeitlin, J. (2008). Learning from difference: the new architecture of experimentalist governance in the European Union. Eur. Law J. 14, 271–327.
Sabel, C. F., and Zeitlin, J. (2012). Experimentalism in the EU: common ground and persistent differences. Regul. Gov. 6, 410–426. doi: 10.1111/j.1748-5991.2012.01157.x
Sagar, S., and Hoffmann, T. (2021). Intermediary liability in the EU digital common market – from the E-commerce directive to the digital services act. IDP. Internet, Law and Politics E-Journal, 34, Universitat Oberta de Catalunya.
Schlag, G. (2023). European Union’s regulating of social media: a discourse analysis of the digital services act. Polit. Governance 11, 168–177. doi: 10.17645/pag.v11i3.6735
Simpson, S. (2022). “Global internet governance and the digital media economy” in The Sage handbook of the digital media economy. eds. T. Flew, J. Holt, and J. Thomas, 571–590.
Steingass, S. (2020). Too effective for Europe? The UK, norm advocacy and the case of EU international cooperation. J. Common Mark. Stud. 58, 384–401. doi: 10.1111/jcms.12927
Tar, J. (2024a). EU Commission opens formal investigation into TikTok, focused on child protection. Euractiv, 19 February. Available at: https://www.euractiv.com/section/platforms/news/eu-commission-opens-formal-investigation-into-tiktok-focused-on-child-protection/
Tar, J. (2024b). Meta and TikTok sue European Commission over digital services act fee. Euractiv. Available at: https://www.euractiv.com/section/platforms/news/meta-and-tiktok-sue-european-commission-over-digital-services-act-fee/
Tourkochoriti, I. (2023). The digital services act and the EU as the global regulator of the internet. Chic. J. Int. Law 24, 129–147.
van Eeten, M., and Mueller, M. (2012). Where is the governance in internet governance? New Media Soc. 15, 720–736. doi: 10.1177/1461444812462850
Vantaggiato, F. (2022). From learning to influence: the evolution of collaboration in European administrative networks. J. Eur. Publ. Policy 29, 1631–1655. doi: 10.1080/13501763.2022.2069843
Vantaggiato, F., Kassim, H., and Wright, K. (2021). Internal network structures as opportunity structures: control and effectiveness in the European competition network. J. Eur. Publ. Policy 28, 571–590. doi: 10.1080/13501763.2020.1737183
Vestlund, N. M. (2017). Pooling administrative resources through EU regulatory networks. J. Eur. Publ. Policy 24, 61–80. doi: 10.1080/13501763.2015.1118147
Keywords: self-regulation, co-regulation, network governance, EU, internet intermediaries
Citation: Simpson S (2024) The limits of internet self-regulation – the EU’s policy for digital internet intermediaries. Front. Commun. 9:1454211. doi: 10.3389/fcomm.2024.1454211
Edited by: José Sixto-García, University of Santiago de Compostela, Spain
Reviewed by: Alba Silva, University of Santiago de Compostela, Spain; Javier García-López, University of Murcia, Spain
Copyright © 2024 Simpson. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Seamus Simpson, s.simpson@salford.ac.uk