
Governments and major corporations seem to think they can do whatever they want with our personal information. Several programs are already up and running that scan people's biometric data (their faces) without anyone asking the citizens affected whether they consent.
People are fighting back in some places, but not everywhere, and the situation may be worse than you think. Here are a few examples, both good and bad, that caught our eye…
Wales greenlights smart cameras
In March 2019, the first UK legal action against police use of facial recognition technology was launched by office worker Ed Bridges and the human rights group Liberty. Bridges brought the challenge after he came to believe facial recognition cameras had captured his face when he popped out for a sandwich on his lunch break. South Wales Police had been using those smart cameras in trials to match people's faces against a database of images.
Unfortunately for British citizens, the UK High Court ruled that the police's use of the technology was not a privacy violation, despite the fact that South Wales Police had captured the biometric data of thousands of people over the previous two years.
Independent study finds facial recognition questionable
Another legal action was launched against London's Metropolitan Police, and in this case the police commissioned an independent investigation to assess their facial recognition trials. Professor Pete Fussey and Dr. Darragh Murray of the University of Essex led that investigation, and we're sad to say their report paints a damning picture. A number of problems were identified, with two issues standing out.
First, the Metropolitan Police failed to consider the human rights implications of their facial recognition trials. Second, the report found it "highly possible" that the trials would not be deemed "necessary in a democratic society" if they were challenged in the courts.
Finally, it is worth adding that, after reviewing the report, the Metropolitan Police chose not to exercise its right of reply.
King’s Cross surveillance scrapped
The UK apparently can't get enough of smart cameras, but the good news is that people are pushing back against this practice. Smart cameras installed at the King's Cross site, next to one of London's major rail hubs, prompted an official investigation by the Information Commissioner's Office. The Office warned that any business use of facial recognition technology must have an adequate legal basis, and as a result the King's Cross site scrapped its future plans for the technology entirely. A further problem was the involvement of the police, who had earlier denied any participation.
The two cameras, operational between 2016 and 2018, were sited on the busy King's Boulevard, which is used by pedestrians and cyclists.
EU looking to regulate facial recognition
Citizens of the European Union may end up better protected than their UK counterparts (after Brexit), with the European Commission mulling plans to regulate the technology. For one thing, future legislation could curb the "indiscriminate use of facial recognition technology" and give EU residents the right to know when their facial recognition data is used.
Although discussions are still at an early stage, explicitly legislating facial recognition technology would bolster citizens' protections beyond the existing restrictions laid out in the EU's General Data Protection Regulation (GDPR).
According to a document seen by the Financial Times, the EU wants to draw up legislation that "should set a world-standard for AI regulation" and set "clear, predictable and uniform rules… which adequately protect individuals."
Cities introduce facial recognition bans in the U.S.
Across the pond, several cities have started addressing the issue with outright bans. In May 2019, San Francisco became the first major American city to ban police use of facial recognition technology. The following month, Somerville, a suburb of Boston, Massachusetts, announced the same ban, and Oakland followed a month later.
That, however, is just the beginning, with the American Civil Liberties Union (ACLU) currently advising lawmakers in New Jersey, California, Massachusetts, Michigan, and New York to introduce similar facial recognition surveillance bans.
But Amazon is doing something different…
Things, however, are not that bright across the whole of the U.S. For instance, the tech giant Amazon has teamed up with more than 400 police departments around the country, asking them to market its Ring smart camera system and granting them access to Ring users' video footage in return.
People found out about this scheme thanks to the efforts of investigative journalists rather than Amazon itself. The revelations prompted Senator Edward J. Markey (D-Mass.) to pen a letter to Amazon requesting further details on the size and scope of its police deals.
What can you do to help ban facial recognition surveillance?
Lobbying is the way to go: we need to push our representatives to ban this practice across the board or, at the very least, to put stringent regulations around it, spelling out how and where facial recognition technology may be used.
To help those efforts, the NGO sector needs your support, and you can join the fight by signing their petitions. One petition is run by Fight for the Future, and there is also one for British citizens, run by Liberty. If you live elsewhere, check with local NGOs about what you can do now, because later it may be too late.
In the meantime, use a VPN to keep your web whereabouts hidden from prying eyes. Here's a list of the best VPN services.