CCTV trial in Queensland ringing privacy alarm bells
March 8, 2017
At the best of times closed-circuit television (“CCTV”) needs to be used carefully, responsibly and proportionately. The danger of function creep is ever present, particularly when CCTV cameras are tied to a network and run by government agencies. The Guardian’s report today, Fears over trial of ‘1984’ surveillance system that anticipates antisocial acts, highlights a highly suspect trial by Toowoomba Regional Council of facial recognition and analytics software intended to predict antisocial behaviour. The story has also had a run on The World Today. The vagueness of the answers given by the Toowoomba mayor, Paul Antonio, about the possible future uses of the cameras highlights the lack of transparency that accompanies these sorts of “trials.” Of almost as much concern is the fact that the Queensland Privacy Commissioner has not been consulted; ignored, more like, even though the Information Privacy Act applies to local government. The Commissioner has prepared guidelines on camera surveillance, vague and inadequate as they are, which are likely to be breached if the commentary in The World Today story proves accurate. In the United Kingdom, where CCTV is ubiquitous, there is a code of practice and there are real consequences for breaching it.
Unfortunately councils are prone to foolish, technology-will-solve-all-problems types of solutions, which tend to create an ill bigger than the problem they address. CCTV, even with analytics software, is far from a crime solver, or even a crime preventer.
The Guardian article provides:
A behavioural surveillance scheme being trialled by a Queensland council in an attempt to anticipate antisocial and illegal acts has prompted concern, with civil liberties advocates saying the technology is “straight out of 1984” and has been linked to racial profiling.
The Queensland privacy commissioner, Philip Green, confirmed he had not been consulted by Toowoomba regional council over its trial of “privacy-invasive” behavioural recognition software with CCTV cameras, which has been linked to racial profiling in the United States.
Toowoomba’s foray into behavioural surveillance technology was also criticised by civil libertarians who warned that people deemed “abnormal” by an algorithm faced being harassed by authorities.
On Monday the council rolled out a month-long trial of “camera analytics software”, which purportedly links with CCTV networks to identify safety risks posed by members of the public.
The technology, sold by a company called iOmniscient, includes facial and number-plate recognition technology and touts an ability to anticipate antisocial behaviour in crowds, such as vandalism and other crimes.
Green said the council had not consulted his office about the rollout of technology despite it raising serious privacy concerns – the second time in a month a Queensland council has done so.
“Frankly I don’t know what [the scheme] is, I have no visibility on it, they haven’t consulted with us to my knowledge and I’d like to look at it,” Green told Guardian Australia.
He said “well-intended” councils were “trying to explore new technology” but in this case the move provoked “some serious debates about profiling people”.
“That’s where I think it can get scary because facial recognition’s not that accurate, mood recognition as Facebook’s trying to roll out or whatever Toowoomba is trying to do with behavioural pattern recognition – all those algorithms have failures,” he said.
“There’s a wider debate that’s beyond privacy, around the adequacy of the decision-making process based on it and that’s a wider thing for artificial intelligence generally.
“If they’re using a privacy-invasive technology in the first place to base their algorithms on, there’s a few concerns across the board.”
Michael Cope, the president of the Queensland Council for Civil Liberties, said the technology was “straight out of 1984” and had been linked in the US with a tendency to over-select racial minorities.
“The algorithms which underlie this technology have at their heart arbitrary concepts of what is normal,” Cope said. “People are selected for attention by authorities on the basis of their supposedly abnormal behaviour or appearance.
“How would you feel if you are innocently minding your own business in the mall and you are approached by the police in front of everyone because a machine thinks you look odd?”
iOmniscient’s Sydney office could not be reached for comment.
But the company website details a product called IQ-120 which “has the ability to detect behaviour deemed suspicious such as loitering or running, based on the speed and pattern of their movement”.
Staff in CCTV monitoring rooms can “more closely monitor people whose behaviour seems suspicious, potentially averting crises before they happen”, it says.
The software can be customised to identify behaviour a client deems “suspicious for their situation”, it says.
It gives the example of tracking potential thieves in car parks by configuring software to “recognise that a normal person would walk directly to their car while generating an alarm for any person that is seen to walk from car to car as they decide which one to steal”.
It could also detect “aberrant behaviour” such as when “a suicidal person may loiter in one place as they build up courage to blow themselves up, or before they jump off a train platform”.
Cope called on Toowoomba council to reveal what data from its records was being “fed in to this software and what data is being fed back into the databases”, adding: “We object to the creation of vast databases.”
The use of behavioural recognition in US airports had resulted in “many innocent people being harassed by security staff”, he said.
“The problem is that when a machine makes the assessment, the pressure is on for the operator to take action lest they be accused of negligence,” he said. “The result is that the number of innocent people being accosted by the authorities will go up.”
Green said it was preferable if councils “could talk to us about it before they roll out” technology that clearly infringed on people’s privacy. This would allow a “privacy-impact analysis [because] that’s going to be the standard internationally” and an assessment of whether it was “in proportion” with its purpose.
It was important that the public was involved in a debate about the “proportionality” of such schemes, he said.
The Gold Coast city council has consulted Green’s office on a trial of facial recognition surveillance in the lead-up to the 2018 Commonwealth Games. Green said even with the security risks around such an event, given the high-level presence of law enforcement, he personally wondered “whether councils should be getting into that sort of thing”.
The commissioner is awaiting a detailed response from Moreton Bay regional council to his queries about its rollout of audio recordings with its CCTV camera network last month. He previously said he was concerned the scheme might breach the state’s Information Privacy Act, as well as criminal law in the Invasion of Privacy Act.
Cope said Queensland councils with camera surveillance had “consistently behaved as if the Information Privacy Act does not apply to them”.
On Tuesday a Toowoomba councillor, Geoff McDonald, said the trial would give other councils a chance to see potential benefits for crowd management, locating missing objects and helping to find missing people.
He said there were “potential community safety benefits through automatic or early identification of antisocial behaviour and potential safety risks”, according to the Sunshine Coast Daily.
McDonald was contacted for further comment.
Notwithstanding the Toowoomba Chronicle’s headline, ‘Big Brother’ cameras watch over Toowoomba, the article seems to enthuse about the supposed upsides. Hardly surprising for a local paper. And the lusty predictions by one of the scheme’s proponents on council, Cr McDonald, that it is a cure-all for social ills are one-sided and highly likely to be wrong. And expensive, in terms of both costs and liberties lost.
It provides:
CRIMINALS, security threats, drunks and vandals will be spotted quickly in Toowoomba now that a camera system featuring a raft of technical capabilities including facial recognition has been rolled out.
Specialised camera analytics software from iOmniscient identifies potential safety risks and is being trialled in Toowoomba after an invitation from the South East Queensland Council of Mayors, Cr Geoff McDonald said.
He said the month-long trial would give councils the chance to see a working system and understand the potential benefits analytics could provide councils, government agencies and the community.
The types of software being trialled will cover functions such as people counting, crowd management and the ability to more easily identify abandoned and missing objects.
It has the capacity to alert camera operators to acts of vandalism and can also be used for number plate and facial recognition.
Cr McDonald said the advanced software had the potential to offer cost savings to Toowoomba Regional Council and other government agencies as well as providing data on how people use public facilities.
He said an analysis of the data had the potential to help the council improve its operations.
The number plate and facial recognition function has the potential to automate access to and from council’s facilities or assist emergency services to find missing people.
There are potential community safety benefits through automatic or early identification of anti-social behaviour and potential safety risks, as well as the identification of people who have left objects behind so they can be returned quickly and easily, according to Cr McDonald.
He said the council would demonstrate the use of the software to representatives from SEQ Mayors this month before reporting back to the council on the trial.
Cr McDonald said the trial advanced the council’s intention to capitalise on greater use of Smart Cities technology to improve service delivery across the region.