Bravely Peering Into A Strange, New World

No one knows quite what to make of Worldcoin.

Signing up is as simple as downloading the World App from the Apple App Store or Google Play and creating a new account. Then an operator uses an orb to scan your face and take pictures of your eyes… to generate a unique account just for you.

If this feels like clichéd dystopian sci-fi, you’re not alone. What could go wrong? If you have seen Minority Report with Tom Cruise, an adaptation of one of Philip K. Dick’s classics, you may have a few ideas. Or nightmares.

Jokes aside, the use case for the tech is proof-of-life, or proof-of-personhood: authenticating that you are not a bot or an AI, that is, an account operated by an algorithm. It is perhaps not very useful now, but its relevance may become more apparent in the coming years.

The iris scan is used to generate a Worldcoin ID unique to your ocular golf balls. Make sure you keep an eye on those.

The Plot Thickens

It is no coincidence that Sam Altman, CEO of OpenAI, is also one of the founders of Worldcoin. The success of OpenAI’s ChatGPT has ignited widespread belief that AI will soon be taking over, not just blue-collar jobs such as manufacturing but work much higher up the white-collar pyramid, even threatening creative jobs previously thought to be irreplaceable.

Much of this fear might be misplaced; it is more likely that AI will work alongside us, functioning as a time-saving tool that improves productivity. But it seems inevitable that companies will become leaner as many tasks are automated.

The use of generative AI and large language models (LLMs) is becoming more widespread. It will become much more difficult to know whether a piece of writing or a video was made by a human or created or altered by an algorithm. Likewise, it will become increasingly difficult to distinguish between bot and human accounts on social media, a problem Elon Musk has been struggling with since his acquisition of Twitter (now X) in late 2022.

Hence the scanning of eyeballs. One account per pair. The machine learning algorithm adds your iris patterns to its model and will recognise you if you lose access to your account or try to scan a second time, enforcing the restriction of one account per person. Some would call that remembering.
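To make the one-account-per-pair idea concrete, here is a minimal, purely illustrative Python sketch, assuming an iris scan has already been reduced to a fixed-length bit string (an “iris code”). The bit-string representation, the distance threshold and the enroll helper are assumptions for illustration, not Worldcoin’s actual matching pipeline.

```python
# Illustrative sketch only; not Worldcoin's actual matching pipeline.
# An iris scan is assumed to have been reduced to a fixed-length bit string
# (an "iris code"); a new code is refused if it is too close to one already enrolled.

from typing import List

MATCH_THRESHOLD = 0.32  # assumed: fraction of differing bits below which two codes "match"


def hamming_fraction(a: str, b: str) -> float:
    """Fraction of positions at which two equal-length bit strings differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)


def enroll(new_code: str, enrolled: List[str]) -> bool:
    """Add new_code only if no enrolled code is a near-match: one account per person."""
    for existing in enrolled:
        if hamming_fraction(new_code, existing) < MATCH_THRESHOLD:
            return False  # this pair of eyes has already signed up
    enrolled.append(new_code)
    return True


# A repeat scan (a slightly noisy copy of an enrolled code) is refused.
db: List[str] = []
print(enroll("1011001110100101", db))  # True  - first enrolment
print(enroll("1011001110100111", db))  # False - near-duplicate of an existing code
```

The real system works on far richer data and at a global scale, but the gatekeeping logic is the same in spirit: refuse enrolment when a near-duplicate already exists.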

Privacy Concerns

There are other ways to capture biometric data: DNA, for example. Those who have travelled to China may know that each person also has distinct earlobes; these must be clearly visible in visa application photos for the mainland, and photos are taken upon entry. According to Worldcoin, iris scanning offers the most accurate and data-rich biometric. Other methods with less data richness, such as Face ID, would only be able to accommodate a few million people.

The team admits that the Worldcoin (WLD) token currently has no utility; furthermore, many VCs and other wealthy individuals bought the tokens at a lower ICO presale price, giving the project strong pump-and-dump vibes from the get-go.

However, the project aims to become as decentralised as possible; the Worldcoin protocol itself is open source, with the intention of encouraging developers to build on top of it and promoting its integration across the digital payments and ID verification landscape.
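As a rough sketch of what building on top of a proof-of-personhood protocol could look like, the snippet below gates an action, say claiming a grant, on a verified proof plus a per-person nullifier, so each person can act only once. The verify_proof function, the nullifier field and the in-memory set are hypothetical placeholders; a real integration would use Worldcoin’s published SDKs and contracts rather than this toy logic.

```python
# Hypothetical integration sketch; names and fields are placeholders,
# not Worldcoin's actual SDK or API.

used_nullifiers = set()  # one entry per person who has already claimed


def verify_proof(proof: dict) -> bool:
    """Placeholder for the protocol-level check that the submitted proof is valid."""
    return proof.get("valid", False)


def claim_grant(proof: dict) -> str:
    """Allow the action once per verified person, keyed by a per-person nullifier."""
    if not verify_proof(proof):
        return "rejected: invalid proof"
    nullifier = proof["nullifier"]  # stable per person, reveals nothing else about them
    if nullifier in used_nullifiers:
        return "rejected: already claimed"
    used_nullifiers.add(nullifier)
    return "grant sent"


print(claim_grant({"valid": True, "nullifier": "0xabc"}))  # grant sent
print(claim_grant({"valid": True, "nullifier": "0xabc"}))  # rejected: already claimed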

Altruism Or Sinister Plot?

Some anticipate that the WLD token will become a distribution mechanism for a universal basic income (UBI), which may be needed should the wealth gap widen further, or perhaps in response to some kind of global crisis, financial or otherwise.

The app also functions as a self-managed, non-custodial cryptocurrency wallet. The World App currently offers grants which can be claimed after you have taken part in the eyeball-scanning ceremony. Free money for those brave enough to participate. Many have criticised the team for recruiting people in countries with less stable domestic currencies to help train the algorithm, specifically countries such as Kenya, where the US$25 welcome grant goes a long way.

Can you really trust them with your biometric data? They claim to delete it after creating your personal ID, but it is difficult to completely delete anything from a memory chip. You wouldn’t want this to be one of those decisions that comes back to haunt you. If you’re feeling brave, make sure you at least verify the authenticity of the orb, which has a model number, and of the operator performing the scan, who must also verify their ID to activate the orb for the account creation process.

But we must consider that numerous images of our faces are already on social media and on professional pages such as LinkedIn or company websites, freely accessible to anyone with internet access. Not to mention the surveillance cameras that keep track of us in all major cities, to keep us safe. What’s the harm in one more image of your face being taken?
