How will the EC's plan to reboot the rules for digital services impact startups? – TechCrunch

A framework for ensuring fairness in digital markets and tackling abusive behavior online is brewing in Europe, fueled by an assortment of issues and ideas, ranging from online security and the spread of disinformation, to platform accountability, data portability and the fair functioning of digital markets.
European Commission lawmakers are even turning to labor rights, spurred on by regional concerns over unfair conditions for platform workers.
On the content side, the central question is how to balance individual freedom of expression online against threats to public discourse, security and democracy from illegal or unwanted content that can be deployed cheaply, anonymously and at scale to pollute genuine public debate.
The age-old adage that the cure for bad speech is more speech can stumble in the face of such scale. Meanwhile, the fact that illegal or harmful content can be lucrative, with outrage-driven engagement serving as an economic incentive, is often overlooked or kept out of this political debate.
Certainly the platform giants, whose business models rely on background data mining of internet users to drive their content sorting and behavioral ad targeting (activity which, notably, remains under regulatory scrutiny under EU data protection law), prefer to frame the issues as a matter of freedom of expression rather than of bad business models.
But with European lawmakers opening a wide-ranging consultation on the future of digital regulation, there is a chance that broader perspectives on platform power will shape the coming decades online, and much more besides.
In search of cutting-edge standards
Over the past two decades, the EU's legal framework for regulating digital services has been the Electronic Commerce Directive, a foundational law that harmonizes basic principles and incorporates liability exemptions, greasing the path of cross-border e-commerce.
In recent years, the Commission has supplemented this by pressing large platforms to self-regulate certain types of content, via a code of conduct on countering illegal hate speech and another on disinformation. However, the codes lack legal bite, and lawmakers continue to criticize platforms for not doing enough, and for not being sufficiently transparent about what they do.