The European Union is poised to make 2020 a banner year for regulation of artificial intelligence (AI). Officials from the bloc floated the idea in January that they would move to rein in tech companies leveraging AI to harvest massive amounts of data. On Feb. 19, regulators made public plans to restrict machine learning technologies that affect a broad range of private and public use cases. Data-sharing rules are also likely to be included in the EU’s Digital Services Act in a move against social media platforms.
How Do Organizations Use AI?
Artificial intelligence allows companies and organizations to process massive amounts of data at a rate that far exceeds human capabilities. The technology can be trained to identify faces in a crowd, for example, or to sift through nearly a million pages of documents to identify the juiciest details.
Machine learning helps companies develop such systems by using algorithms to find patterns in a data set. Through machine learning, AI systems continually improve without a human having to program each advancement manually.
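The idea of "finding patterns in a data set" can be made concrete with a toy example. The sketch below is a minimal nearest-centroid classifier written for illustration only; it is not Facebook's system or any production algorithm. It "learns" the average of each labeled group of measurements, then assigns a new, unlabeled sample to whichever average it sits closest to.

```python
def train(samples):
    """samples: list of (features, label). Returns one centroid per label."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    # The "pattern" the algorithm extracts: the mean of each labeled group.
    return {label: [x / counts[label] for x in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Assign a new sample to the label whose centroid is nearest."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))

# Two clusters of 2-D "measurements" with known labels.
data = [([1.0, 1.1], "A"), ([0.9, 1.0], "A"),
        ([5.0, 5.2], "B"), ([5.1, 4.9], "B")]
model = train(data)
print(predict(model, [1.2, 0.8]))  # → A
print(predict(model, [4.8, 5.0]))  # → B
```

Real systems such as face recognition use vastly larger feature vectors and more sophisticated models, but the principle is the same: the more labeled examples the system sees, the better its learned patterns match new inputs, with no manual reprogramming required.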
Facebook serves as a popular example of machine learning in use, owing in part to its CEO’s refusal to restrict the company’s business practices. As users upload more photos to the social media platform, either of themselves or of people they know, machine learning helps identify the faces before users do. When a user decides to tag a photo, the correct name of the individual is often already at the top of the list because Facebook’s AI already knows who it is.
The company also relies heavily on AI to connect users, commonly through the “people you may know” feature. As users sync their phone and email contacts, and connect to mutual acquaintances, Facebook uses AI to accurately predict other possible friends on the website.
Facebook even uses AI to create shadow profiles for people who do not have Facebook accounts. These shadow profiles are generated by scanning phone numbers and email addresses, then cross-referencing the data with other users who have the same contacts in their phones. When a person eventually signs up for a Facebook account, the “people you may know” tool is likely to make some fairly accurate guesses.
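The cross-referencing described above can be sketched in a few lines. This is a hypothetical illustration, not Facebook's actual code: contact lists uploaded by existing users are inverted into an index keyed by phone number or email address, so when the owner of that contact eventually signs up, everyone who already had them in an address book becomes a ready-made friend guess.

```python
from collections import defaultdict

def build_shadow_index(uploaded_contacts):
    """uploaded_contacts: {user: set of phone/email identifiers}.
    Returns the inverted index {identifier: set of users holding it} —
    in effect, a 'shadow profile' for each identifier not yet on the site."""
    index = defaultdict(set)
    for user, contacts in uploaded_contacts.items():
        for contact in contacts:
            index[contact].add(user)
    return index

def friend_guesses(index, new_user_identifier):
    """When someone signs up with this identifier, suggest every user
    who already had it in their uploaded contacts."""
    return sorted(index.get(new_user_identifier, set()))

# Two existing users sync their address books (identifiers are made up).
uploads = {
    "alice": {"+15550001", "bob@example.com"},
    "carol": {"+15550001", "+15550002"},
}
index = build_shadow_index(uploads)
print(friend_guesses(index, "+15550001"))  # → ['alice', 'carol']
```

Note that the person behind "+15550001" never consented to any of this: the index about them was assembled entirely from other people's uploads, which is precisely what troubles regulators.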
AI can also be leveraged for more sinister activities, such as in China. The Communist Party of China recently employed AI facial recognition software to identify Uighur Muslims in Xinjiang Province. The software, forced onto smartphones across the region, also scanned photo albums, emails, and other data on the devices in order to perform “threat” assessments. Results from the system have been used as a basis to detain an estimated one million members of the ethnic group in concentration camps.
Why is AI a Concern for the United States and EU?
Legislators, both in the United States and the EU, are worried that companies such as Facebook are abusing privacy rights. There is a growing concern that people do not have the ability to opt out of systems processing their data, such as Facebook’s “people you may know” feature and shadow profiles. People with shadow profiles never agreed to create an account and never legally consented to having their data, including facial identities, harnessed by the tech giant.
Secondly, there is a question of how the data is used by companies. Data is vital to businesses such as Facebook that leverage it for targeted advertising. The better data a media platform has, the more enticing it is for a company to advertise on it, because it can home in on users who are more likely to purchase its product. Data sharing between the companies that collect it and their third-party partners raises the questions of who really owns the information and whether users ever consented to sharing it.
Furthermore, there are legitimate security concerns. After a massive Facebook data leak exposed the information of practically every user, investigations made clear that the company’s culture did not prioritize securing its servers.
Then there is a question of who is liable for content posted on social media platforms. Do companies such as Facebook have a responsibility to police content and could they be held legally culpable for posts on their platforms?
Finally, there are antitrust allegations against major tech companies. Recent EU cases have resulted in Google suffering roughly $9 billion in fines. Companies with large data sets can use them to promote their own products over those of their rivals. In Google’s case, European regulators ruled that the company improperly used the power of its search engine to direct traffic to its own shopping ads rather than those of its rivals.
What Will the EU Do?
The EU will work on drafting the Digital Services Act this year. It will likely place restrictions on how companies may use AI across a broad swath of industries including medical screening and self-driving cars, WSJ reported. The legislation may also impose limits on how much data is collected, such as the number of faces in a crowd that are allowed to be identified and shown.
Regulators are also considering mandating that humans oversee machine-learning tasks and that companies properly disclose which data sets collected information will be stored in.
To combat antitrust allegations, the EU may decide to force larger companies to share the data they collect with smaller rivals.
The EU is also seeking to boost European companies. Presently, the European Commission considers Europe to be behind the US and China in terms of data operations. So while it would like to restrict the data practices of current tech leaders, the EU would also like to build its own industry through investments totaling €20 billion per year.
“We recognize we missed the first battle, the battle of personal data,” said EU Commissioner Thierry Breton. “Europe has everything it takes to lead the ‘big data’ race, and preserve its technological sovereignty.”
In short, the EU appears miffed that companies such as Facebook continually fail to protect Europeans’ personal information while profiting from it. In response, it will crack down on AI practices this year and look to foster European competitors that will hopefully be easier to control and regulate.