Demand for Clearview AI to Stop Scraping Images Grows to Include Facebook & LinkedIn
Debates frequently arise over where the acceptable line lies when developing and deploying new technologies and Artificial Intelligence (AI).
However, it seems social media giants Facebook, LinkedIn, and Instagram are taking a stand for their users’ privacy and their own policies.
Cease and Desist
Last month, Clearview AI came into the public eye after the New York Times ran an exposé noting that the company works by scraping images from across the internet and helping law enforcement identify persons of interest efficiently. According to claims made by Clearview, the Indiana State Police were able to identify a suspect in only 20 minutes.
Clearview has come back into the spotlight with Facebook “demanding” that they “stop accessing or using information from both Facebook and Instagram.” According to media giant CBS, Facebook has sent multiple letters not only demanding, but also expanding on its policies and asking for further details on Clearview’s practices.
While LinkedIn has been quick to send a formal cease and desist demand, Facebook has yet to take that step. However, it seems one may be on the horizon.
Ethics and Tech
When Clearview came into the public eye, even Google announced that while it had the capability to develop similar technology, it refrained from acting on it. This was due to serious privacy concerns and the real possibility of it becoming a dangerous tool in the wrong hands.
What is marketed as a way for police to easily unveil the identities of criminals can spark a stream of ethical debates. Some cities, such as San Francisco, are going further, barring police officials from even accessing apps like this.
Does this technology help catch more criminals, or would it create a new avenue for crime to increase and criminals to get craftier? Eric Goldman, a co-director of the High Tech Law Institute at Santa Clara University, said, “The weaponization possibilities of this are endless.” From a rogue police officer to corrupt government officials, one can’t be sure where this AI tech will lead us.
In recent years, even as mobile apps became an everyday necessity for productivity, connection, and operations, there was also a sense that our best interests weren’t at heart. To see these companies come together and take a stand against the violation of their terms of service may be a sign that ethical policies are making their way back into these multi-billion dollar empires.
Information collection on social media sites has always been concerning. Even when the feature of verifying your identity to unlock your phone was introduced, there was an uproar among users everywhere. Wondering what phone companies and social networking sites could do with this and other collected data was enough to bring Facebook’s Mark Zuckerberg to a Senate hearing last April.
When the New York Times began looking into Clearview, journalist Kashmir Hill noticed that the company was shrouded in secrecy. Concerned, she asked local law enforcement to search for her through the app, and the police then received a phone call asking if the NYPD was talking to the media. This was a shocking realization that the company itself could monitor what police departments were searching, which makes anyone wonder: what could they possibly do with that information?
Social media giants are standing their ground when it comes to the policies and procedures set in place to protect their users from private enterprises like Clearview AI. The ethics of this innovative tech are still in question. Like a page from George Orwell’s 1984, many individuals are becoming more and more aware that Big Brother may indeed be watching. The unfolding story of this app is sure to remain at the forefront of the debate over what is and is not ethically acceptable in the advancement of society.