(LONDON) iProov Facial Recognition Report: Biometrics is big business for security firms as one snaps up $70M for its facial verification technology, already in use by Homeland Security, the NHS and others #AceNewsDesk report

#AceNewsReport – Jan.07: The funding is coming from a single investor, Sumeru Equity Partners out of the Bay Area, which originally started life as a part of Silver Lake before spinning off as an independent operation in 2014. Valuation is not being disclosed, nor is the total raised by the company to date.

#AceSecurityDesk says according to a TechCrunch News Report: Biometrics, and specifically facial recognition, have seen a surge of usage in the last several years, first as a tool to help organizations verify identities digitally against rising waves of fraud and cybercrime; and second as a way to help enable that process even further in our socially-distanced, pandemic-punctuated times.

https://www.iproov.com/

Today, a startup called iProov, which provides face authentication and verification technology to a number of governments and other big organizations — attracting some controversy in the process — is announcing $70 million in funding to keep up its growth momentum.

London-based iProov has seen a lot of business traction so far in its home market of the U.K., and it now plans to use the capital specifically to continue building out its presence in the U.S. and other international markets where it has already started to get a foothold. iProov works at the large enterprise level and its customer base currently includes the U.S. Department of Homeland Security, the U.K.’s Home Office and National Health Service (NHS), the Australian Taxation Office, GovTech Singapore and the banks Rabobank and ING. iProov said 2021 was a bumper year for the company: it tripled its revenues over the year before (although it is not disclosing how much that works out to in actual terms).

As a measure of how much, and how, iProov is getting used: the NHS says that as of September 2021, usage of its NHS app had ballooned to 16 million users, from just 4 million in May 2021 (and it is likely higher now). The app uses iProov to power the facial verification required to register, after which users can check and show their vaccination status, book doctor appointments, re-order prescriptions, view their medical records, get advice and more.

To be clear, this isn’t facial recognition, which founder and CEO Andrew Bud describes as a mere “commodity” these days. iProov’s technology, sold as Genuine Presence Assurance and Liveness Assurance, lets an organization capture an image of an individual, verify that the image is real (not a deepfake or other counterfeit) and that it matches another piece of ID, and then proceed with whatever transaction is under way, all by way of cloud-based, remote mobile technology.
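
To make that flow concrete, here is a minimal, purely hypothetical sketch of the kind of cloud-based verification call such a service might expose. The endpoint, field names and response shape are invented for illustration and are not iProov’s actual API, which the article does not document.

```python
# Hypothetical illustration only: NOT iProov's API. It sketches the general
# flow described above: capture a selfie, check it is a live capture (not a
# deepfake or replay), then match it against the photo on an identity document.
import requests  # standard HTTP client

VERIFY_URL = "https://api.example-biometrics.com/v1/verify"  # placeholder endpoint


def verify_user(selfie_path: str, id_photo_path: str, api_key: str) -> bool:
    """Return True if the selfie passes liveness checks and matches the ID photo."""
    with open(selfie_path, "rb") as selfie, open(id_photo_path, "rb") as id_photo:
        response = requests.post(
            VERIFY_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            files={"selfie": selfie, "id_photo": id_photo},
            timeout=30,
        )
    response.raise_for_status()
    result = response.json()
    # A real service would return richer data; here we assume two boolean fields.
    return result.get("is_live", False) and result.get("faces_match", False)
```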

At peak usage last year, iProov was typically handling more than 1 million facial verifications per day.

But that growth has not come without scrutiny and other controversial attention.

Critics have slammed iProov and the U.K. government for a lack of transparency over how user data is handled in the process of capturing and authenticating images for biometric verification, particularly given that iProov is a private company working for a public organization. Relatedly, ethical questions have been raised about the links between some of the startup’s earliest backers and the Tory Party (which is currently in power in the U.K.).

And as of this week (timed to coincide with the funding news?) iProov has also been the subject of a patent lawsuit from a U.S. rival called FaceTec, which claims that iProov has copied parts of its technology and is demanding an injunction (something that could be tricky as iProov increases its focus on the U.S.).

Meanwhile, iProov has also been involved in early work to see how and whether its facial authentication technology might be applied in other use cases, such as these trials to speed up Covid vaccination certification, another potential avenue for scrutiny.

In an interview, Bud was quick to counter the controversial currents that have swirled around his company and the technology that it’s built.

On the issue of privacy and security: Bud is a longtime veteran of the telecoms and mobile worlds, initially as an engineer and then as an executive, and he said that his interest in biometrics was sparked after being burned at his previous company, mBlox, where malicious hackers exploited the company’s SMS infrastructure and stole millions of dollars from customers.

The experience made him realize how critical security needed to be, both on the provider’s end and as something that was easy for consumers to engage with. “It needed to be ultra-inclusive and simple,” Bud said. “How can we ensure something like that would never happen again? I had to solve that problem.” That, he said, was what spurred him to start looking at biometrics, which he believes is the best answer to that question. And from that he built his next company, which became iProov.

“These are fair questions,” he said in response to me raising the issue of privacy and data protection at iProov and its work with public and private institutions. “Privacy is extremely important to iProov and our systems are built to protect users.” Everything is compliant with GDPR and other government-mandated data protection rules governing how data may or may not be used, he added, and the methods iProov uses to process user data are built to keep customers and their identities safe from being compromised. He also confirmed that none of the data that passes through its system is used for commercial purposes. iProov runs a policy of not knowing the identities or other personal information related to any photos, but it does store imagery, specifically to help track and block malicious actors and to flag anomalies.

On the subject of the patent infringement lawsuit from FaceTec, Bud dismissed it as “completely unfounded,” with a spokesperson sending me a more complete statement after my interview (as well as asking we keep this part out of the story altogether…):

“All of our products have been developed in-house and are covered by granted patents. Accusations that we have used [FaceTec] technology in our products are completely unfounded, and iProov will take all appropriate actions to defend itself and its customers.”

And as for future applications, although the U.K. government hasn’t yet shown a willingness to mandate so-called “Covid passports” widely, where people have to provide quick verification of their vaccination status to gain entry to events, public venues, workplaces and more, the basics of that technology are already there and being used by a number of other customers, Bud said. These include a recent launch from Eurostar (which runs the train under the English Channel between London and cities on the European continent) that lets passengers authenticate their various credentials at home, reducing dwell time at check-in; they can then walk through simply by showing their faces to a screen.

Facial-related biometrics, Bud said, are likely to remain the mainstay of what iProov and others develop for these and similar use cases, although the company also offers a palm-based identification method. Primarily, however, iProov and others will have to follow the lead of the organizations they work for: their tech will only be as useful as whatever biometric information the originating organization collects. (And these days, government-issued IDs, with photos, remain the main source of that data.)

As we move ever more processes to digital and cloud-based platforms, finding ever more watertight methods of verifying identities of users, while evading the increasingly sophisticated approaches of fraudsters and malicious hackers, will continue to be a huge priority. Investors seem willing to place bets on iProov being one of the strong players in keeping those services working as they should.

“We see iProov as becoming the industry standard to establish the genuine presence of anything (a person, a document etc),” said Kyle Ryland, a managing partner at Sumeru, in a statement to TechCrunch. “We hope that iProov will be used not only to accelerate digital onboarding and verification for both online and physical experiences, but also to replace the use of insecure passwords for frictionless authentication and much more. We have a platform that is constantly learning and allows us to remain at the forefront of emerging technologies and new security threats.”

#AceNewsDesk report …………..Published: Jan.07: 2022:

Editor says …Sterling Publishing & Media Service Agency is not responsible for the content of external sites or of any reports, posts or links, and can also be found here on Telegram: https://t.me/acenewsdaily. All of our posts from Twitter can be found here: https://acetwitternews.wordpress.com/ and all WordPress and live posts and links here: https://acenewsroom.wordpress.com/. Thanks for following; as always we appreciate every like, reblog or retweet, plus free help and guidance tips on your PC software, or get help & guidance from our experts at AcePCHelp.WordPress.Com

#acesecuritynews, #dhs, #facial-recognition, #london, #nhs, #technology, #washington

(MINNEAPOLIS) CBP REPORT: Customs and Border Protection at Minneapolis-St. Paul International Airport (MSP) is encouraging travelers to use the Global Entry facial recognition capabilities during their next flight #AceNewsDesk report

#AceNewsReport – Aug.29: When Global Entry members approach a kiosk with facial comparison capability, they will pause for a photo just as they do at existing kiosks. CBP will use biometric facial comparison technology to match the new photo against images that the member has already voluntarily provided to the government, such as passport and Global Entry enrollment photos. The kiosk will then inform the traveler how to proceed, based on the results of the matching process.

#AceDailyNews reports that Global Entry is a program in which travelers volunteer to provide personally identifiable information and consent to CBP security vetting in return for expedited processing at U.S. airports and Preclearance locations. Participation in the program is open to U.S. citizens, nationals, and permanent residents as well as the citizens of 12 other countries. Citizens and residents of Canada who are NEXUS members are eligible to receive Global Entry benefits.

Global Entry members will not be required to swipe their passports or submit fingerprints when using new or upgraded Global Entry kiosks. However, CBP will continue to require prospective members to provide passport information and fingerprints to CBP when applying to the program.

CBP is not creating a new inspection requirement or collecting new information through this process. The new process will only apply to Global Entry members and NEXUS members who receive Global Entry benefits. Global Entry and NEXUS are voluntary programs.

“The facial recognition software is a secure process that will enhance the customer’s experience while traveling,” said LaFonda Sutton-Burke, Director, Field Operations-Chicago. “I would encourage all travelers to take advantage of this technology to speed up their arrival process when traveling to the U.S.”

CBP takes its privacy obligations very seriously and is dedicated to protecting the privacy of all travelers. CBP has employed strong technical security safeguards and has limited the amount of personally identifiable information used in the new biometric process.

If for some reason the system cannot match the Global Entry member to an image on record, the system will simply revert to the existing process. The traveler would be prompted to swipe his/her passport and submit his/her fingerprints.
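
As a rough illustration of the match-or-fall-back logic described in the paragraphs above, here is a hedged sketch in Python. The threshold, function names and on-screen messages are assumptions made for illustration; they are not details of CBP’s actual kiosk system.

```python
# Hypothetical sketch of the kiosk decision flow described above; the
# threshold, names and messages are invented, not CBP's implementation.
from dataclasses import dataclass

MATCH_THRESHOLD = 0.9  # assumed similarity cutoff


@dataclass
class KioskResult:
    admitted_via_face: bool
    instructions: str


def process_traveler(live_photo, enrollment_photos, compare) -> KioskResult:
    """Match the kiosk photo against previously enrolled images; fall back if no match."""
    best_score = max(
        (compare(live_photo, ref) for ref in enrollment_photos), default=0.0
    )
    if best_score >= MATCH_THRESHOLD:
        return KioskResult(True, "Proceed: identity confirmed by facial comparison.")
    # No match found: revert to the existing process (passport swipe and fingerprints).
    return KioskResult(False, "Please swipe your passport and submit fingerprints.")
```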

More information about measures that CBP is taking to protect traveler privacy can be found at: https://www.dhs.gov/publication/global-enrollment-system-ges.

#AceNewsDesk report ……Published: Aug.29: 2021:

Editor says …Sterling Publishing & Media Service Agency is not responsible for the content of external sites or of any reports, posts or links, and can also be found here on Telegram: https://t.me/acenewsdaily. All of our posts from Twitter can be found here: https://acetwitternews.wordpress.com/ and all WordPress and live posts and links here: https://acenewsroom.wordpress.com/. Thanks for following; as always we appreciate every like, reblog or retweet, plus free help and guidance tips on your PC software, or get help & guidance from our experts at AcePCHelp.WordPress.Com

#cbp, #facial-recognition, #minneapolis

(BEIJING) JUST IN: A camera system that uses AI and facial recognition intended to reveal states of emotion has been tested on Uyghurs in Xinjiang, the BBC has been told #AceNewsDesk report

#AceNewsReport – May.27: The Chinese embassy in London has not responded directly to the claims but says political and social rights in all ethnic groups are guaranteed:

CHINA: ‘AI emotion-detection software tested on Uyghurs: a software engineer claimed to have installed such systems in police stations in the province, and a human rights advocate who was shown the evidence described it as shocking’

By Jane Wakefield
Technology reporter 

A gate of what is officially known as a “vocational skills education centre” in Xinjiang

Xinjiang is home to 12 million ethnic minority Uyghurs, most of whom are Muslim.

Citizens in the province are under daily surveillance. The area is also home to highly controversial “re-education centres”, which human rights groups call high-security detention camps, where it is estimated that more than a million people have been held.

Beijing has always argued that surveillance is necessary in the region because it says separatists who want to set up their own state have killed hundreds of people in terror attacks.

Xinjiang is believed to be one of the most surveilled areas in the world (Getty Images)

The software engineer agreed to talk to the BBC’s Panorama programme under condition of anonymity, because he fears for his safety. The company he worked for is also not being revealed. 

But he showed Panorama five photographs of Uyghur detainees who he claimed had had the emotion recognition system tested on them.

Data from the system purports to indicate a person’s state of mind, with red suggesting a negative or anxious state of mind.

“The Chinese government use Uyghurs as test subjects for various experiments just like rats are used in laboratories,” he said.

And he outlined his role in installing the cameras in police stations in the province: “We placed the emotion detection camera 3m from the subject. It is similar to a lie detector but far more advanced technology.”

He said officers used “restraint chairs” which are widely installed in police stations across China.

“Your wrists are locked in place by metal restraints, and [the] same applies to your ankles.”

He provided evidence of how the AI system is trained to detect and analyse even minute changes in facial expressions and skin pores.

According to his claims, the software creates a pie chart, with the red segment representing a negative or anxious state of mind.

He claimed the software was intended for “pre-judgement without any credible evidence”.

The Chinese embassy in London did not respond to questions about the use of emotional recognition software in the province but said: “The political, economic, and social rights and freedom of religious belief in all ethnic groups in Xinjiang are fully guaranteed.

“People live in harmony regardless of their ethnic backgrounds and enjoy a stable and peaceful life with no restriction to personal freedom.”

The evidence was shown to Sophie Richardson, China director of Human Rights Watch.

“It is shocking material. It’s not just that people are being reduced to a pie chart, it’s people who are in highly coercive circumstances, under enormous pressure, being understandably nervous and that’s taken as an indication of guilt, and I think, that’s deeply problematic.”

Suspicious behaviour

According to Darren Byler, from the University of Colorado, Uyghurs routinely have to provide DNA samples to local officials, undergo digital scans and most have to download a government phone app, which gathers data including contact lists and text messages.

“Uyghur life is now about generating data,” he said.

“Everyone knows that the smartphone is something you have to carry with you, and if you don’t carry it you can be detained, they know that you’re being tracked by it. And they feel like there’s no escape,” he said.

Most of the data is fed into a computer system called the Integrated Joint Operations Platform, which Human Rights Watch claims flags up supposedly suspicious behaviour.

“The system is gathering information about dozens of different kinds of perfectly legal behaviours including things like whether people were going out the back door instead of the front door, whether they were putting gas in a car that didn’t belong to them,” said Ms Richardson.

“Authorities now place QR codes outside the doors of people’s homes so that they can easily know who’s supposed to be there and who’s not.”

Orwellian?

There has long been debate about how closely tied Chinese technology firms are to the state. US-based research group IPVM claims to have uncovered evidence, in patents filed by such companies, suggesting that facial recognition products were specifically designed to identify Uyghur people.

A patent filed in July 2018 by Huawei and the China Academy of Sciences describes a face recognition product that is capable of identifying people on the basis of their ethnicity.

Huawei said in response that it did “not condone the use of technology to discriminate or oppress members of any community” and that it was “independent of government” wherever it operated.

The group has also found a document which appears to suggest the firm was developing technology for a so-called One Person, One File system.

“For each person the government would store their personal information, their political activities, relationships… anything that might give you insight into how that person would behave and what kind of a threat they might pose,” said IPVM’s Conor Healy.

Hikvision makes a range of products, including cameras (VCG)

“It makes any kind of dissidence potentially impossible and creates true predictability for the government in the behaviour of their citizens. I don’t think that [George] Orwell would ever have imagined that a government could be capable of this kind of analysis.”

Huawei did not specifically address questions about its involvement in developing technology for the One Person, One File system but repeated that it was independent of government wherever it operated.

The Chinese embassy in London said it had “no knowledge” of these programmes.

IPVM also claimed to have found marketing material from Chinese firm Hikvision advertising a Uyghur-detecting AI camera, and a patent for software developed by Dahua, another tech giant, which could also identify Uyghurs.

Dahua said its patent referred to all 56 recognised ethnicities in China and did not deliberately target any one of them.

It added that it provided “products and services that aim to help keep people safe” and complied “with the laws and regulations of every market” in which it operates, including the UK.

Hikvision said the details on its website were incorrect and “uploaded online without appropriate review”, adding that it did not sell or have in its product range “a minority recognition function or analytics technology”.

Dr Lan Xue, chairman of China’s National Committee on AI Governance, said he was not aware of the patents.

“Outside China there are a lot of those sorts of charges. Many are not accurate and not true,” he told the BBC.

“I think that the Xinjiang local government had the responsibility to really protect the Xinjiang people… if technology is used in those contexts, that’s quite understandable,” he said.

The UK’s Chinese embassy had a more robust defence, telling the BBC: “There is no so-called facial recognition technology featuring Uyghur analytics whatsoever.”

Daily surveillance

Hu Liu feels his life is under constant surveillance

China is estimated to be home to half of the world’s almost 800 million surveillance cameras.

It also has a large number of smart cities, such as Chongqing, where AI is built into the foundations of the urban environment.

Chongqing-based investigative journalist Hu Liu told Panorama of his own experience: “Once you leave home and step into the lift, you are captured by a camera. There are cameras everywhere.”

“When I leave home to go somewhere, I call a taxi, the taxi company uploads the data to the government. I may then go to a cafe to meet a few friends and the authorities know my location through the camera in the cafe.

“There have been occasions when I have met some friends and soon after someone from the government contacts me. They warned me, ‘Don’t see that person, don’t do this and that.’

“With artificial intelligence we have nowhere to hide,” he said.

Find out more about this on Panorama’s Are you Scared Yet, Human? – available on iPlayer from 26 May

#AceNewsDesk report ……Published: May.27: 2021:

Editor says #AceNewsDesk reports by https://t.me/acenewsdaily and all our posts, also links, can be found here for Twitter and Live Feeds: https://acenewsroom.wordpress.com/. Thanks for following; as always we appreciate every like, reblog or retweet, plus free help and guidance tips on your PC software, or get help & guidance from our experts at AcePCHelp.WordPress.Com

#ai, #beijing, #china, #facial-recognition, #london, #software