Social Tech: Boon or Bane?


Instead of imposing blanket rules, Governments globally can adopt a regulatory model built on negotiation with technology companies and targeted at specifically identified issues

Technology companies have fundamentally changed the way in which we obtain information, communicate, travel, transact in goods and services and consume content. However, they’ve also been responsible, in many ways, for some of the most pressing issues we face today. For example, concerns that TikTok exposes children to predatory behaviour led to the app being removed from Google’s and Apple’s app stores, although this decision has since been reversed. YouTube has faced similar issues in the past. Social media platforms like Facebook and Twitter are used to bully and harass people, especially those from socially marginalised communities, and to amplify extremist content and propaganda. There are multiple instances where Facebook has been involved in leaking user information, harvesting user data, using targeted advertising to influence election results and pushing political propaganda. It has, along with WhatsApp, also been used to propagate “fake news” and spread disinformation and misinformation, which has been linked to violence and deaths in India, Myanmar and Sri Lanka. Uber, Amazon and Apple have all been criticised for their lax labour standards. And Google, Facebook and Amazon have been accused of (and sometimes held liable for) anti-competitive behaviour. More generally, there is also a lack of transparency in the way these companies function and most of them have access to vast amounts of user data, which is collected, stored and sold to advertisers with minimal oversight.

Many of these issues exist, and are exacerbated, because some of these technology companies, especially the “Big Tech” firms (a label that generally refers to Amazon, Facebook, Alphabet, Microsoft and Apple), run vast businesses and command significant global market share and revenue. A major reason why these companies were able to scale rapidly to their current sizes lies in their leveraging of what is known as the “network effect”, wherein the value of certain goods or services increases as they gain more users. These platforms have also been able to invest in improving their services, especially by collecting and leveraging large pools of user data to improve their machine-learning algorithms.
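To make the scale of this compounding concrete, one common heuristic (not one the article itself invokes) is Metcalfe’s law, which treats a network’s value as growing roughly with the number of possible connections between users rather than with the number of users alone. The short Python sketch below compares the two growth patterns; the functions and figures are purely illustrative assumptions, not measurements of any real platform.

```python
# Illustrative sketch only: compares linear growth in value with
# connection-based (Metcalfe-style) growth to show why network effects
# let platforms scale so quickly. All figures are hypothetical.

def linear_value(users: int, value_per_user: float = 1.0) -> float:
    """Value if each user mattered independently of every other user."""
    return users * value_per_user

def network_value(users: int, value_per_link: float = 1.0) -> float:
    """Value if every possible connection between users adds value (~ n^2 / 2)."""
    return value_per_link * users * (users - 1) / 2

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7,} users: linear = {linear_value(n):>12,.0f}   "
          f"network = {network_value(n):>16,.0f}")
```

Under these toy numbers, a hundredfold increase in users yields roughly a ten-thousandfold increase in “network” value, which is the dynamic that lets incumbents pull away from smaller rivals.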

While utilising network effects is not problematic by itself, the vast wealth of these firms has also meant that they could often simply buy out and integrate the products or services of competitors, or prioritise their own platforms over others and, hence, entrench their dominant positions. Furthermore, until recently, they faced virtually no regulatory oversight and were given free rein in how they chose to conduct business. This was, in large part, a consequence of the public support that these companies enjoyed and because they, and technology more generally, were seen as enabling access to information by providing “free” services or helping consumers obtain goods and services at lower costs. However, the issues that have since arisen have made it clear that the question is no longer whether to regulate technology companies but how best to do so.

Regulators around the world are grappling with this problem in multiple ways. The European Union (EU) introduced the General Data Protection Regulation (GDPR) to regulate the use of the personal data of people within the bloc. Germany introduced a law requiring platforms to remove hate speech and other illegal content within 24 hours of being informed of such material. Singapore introduced a Bill that seeks to impose penalties on users and platforms for spreading “false statements of fact” in the country. Australia passed a Bill that forces technology companies to hand over encrypted data to the police. More recently, the lower House of the Russian Parliament went so far as to vote in support of a Bill that would essentially allow the country to create its own domestic internet, ostensibly for national security reasons.

Some of these measures can have unintended consequences. For instance, the GDPR has been criticised for its broad definitions and because its stringent data protection requirements are likely to disproportionately affect smaller companies by driving up compliance costs, potentially stifling competition. Similarly, in India, there are concerns that regulations, often aimed at fixing issues caused by larger platforms, could significantly impair the ability of start-ups to scale their businesses. Moreover, some of the more extreme measures that have been introduced (such as those requiring backdoors to encryption or restricting the Internet to national boundaries) have the potential to alter the nature of the Internet itself and have wide-ranging implications for civil liberties, security and rights such as privacy.

India is also in the process of framing regulations applicable to this space. Over the course of the last year, the Government published the draft Information Technology [Intermediaries Guidelines (Amendment) Rules] 2018, the draft Personal Data Protection Bill and the draft National e-Commerce Policy. While each of these regulations was ostensibly introduced to solve specific issues and sought to regulate a distinct area of the digital economy, they also overlap and affect one another. For instance, the Intermediary Rules primarily sought to address misinformation and “fake news” on social media platforms by requiring intermediaries to take certain steps, such as proactively monitoring their platforms for unlawful content. However, the definition of “intermediaries” is broad enough to encompass practically every online entity, including social media sites, messaging platforms, e-commerce platforms, cyber cafes, payment companies and internet service providers, thereby making these rules applicable to all of them as well. The draft e-Commerce Policy, in addition to introducing data localisation and other requirements, also seems to conceptualise a State-controlled, community-owned model for data, referring to it as a “collective resource” and a “national asset.” This has implications for entities beyond e-commerce companies and can affect how the right to privacy develops in India. It also potentially runs counter to the Supreme Court’s decision in Puttaswamy vs Union of India, where privacy, framed primarily as an individual right, was held to be a fundamental right.

Technology companies can vary widely in the functions and services they provide, even though they all share the attribute of providing goods and services through the internet. Therefore, in order to avoid unintended consequences and over-broad application, regulations must be narrowly crafted to address specific, identified issues. They should also be differentiated according to certain metrics, such as the function served by the platform, the potential impact on users and businesses, and the aim the regulation seeks to achieve. One possible method for crafting regulation on this basis is to apply more stringent requirements to companies that reach a certain size and scale and cross prescribed thresholds (whether measured in revenue, user or subscription base, or a combination of such metrics). Another is to regulate intermediaries based on the function they undertake or the service they provide. However, given how digital companies integrate multiple services such as payments, chat and networking onto the same platforms, this could also be challenging and would require extensive collaboration with other regulators.
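As a minimal sketch of the threshold-based approach described above, the following Python snippet assigns obligations by size tier. The specific thresholds, tier labels and the Platform fields are hypothetical assumptions made for illustration; they are not drawn from the article or from any existing or proposed regulation.

```python
# Hypothetical sketch of tiered, threshold-based regulation as discussed above.
# Thresholds, tier names and Platform fields are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Platform:
    name: str
    annual_revenue_usd: float
    monthly_active_users: int

def regulatory_tier(p: Platform) -> str:
    """Apply stricter obligations to platforms that cross prescribed thresholds."""
    if p.annual_revenue_usd >= 1_000_000_000 or p.monthly_active_users >= 50_000_000:
        return "large platform: enhanced transparency, audit and redressal duties"
    if p.monthly_active_users >= 5_000_000:
        return "mid-size platform: baseline accountability and reporting duties"
    return "small platform or start-up: lighter-touch obligations"

print(regulatory_tier(Platform("ExampleApp", 2e8, 60_000_000)))  # falls in the large tier
```

A function-based classification, the article’s second option, could be layered on top of the same structure by keying obligations to the services a platform offers rather than, or in addition to, its size.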

Another key component to consider in this context is the role of the companies currently being regulated. It is especially important to include dialogue with technology companies in the context of the digital economy, given that platforms are best placed to understand the limits and capabilities of the technologies they deploy. This is also why it might be useful to see regulations not just as a way to impose liability on companies, but also as a means of increasing platform accountability. A regulatory model that incorporates dialogue with these companies and is based on principles of platform accountability, transparency and effective redressal mechanisms may be a more effective way to address some of the challenges presented by digital platforms than the sort of blanket regulations that are the norm today.

(The writer is a junior fellow at the Esya Centre)

Writer: RK Pachauri

Courtesy: Aishwarya Giridhar

