1. One thing that helps

Take It Down

Finally I have something positive to report about Meta! Thanks, Zuck!

Take It Down is a new service from the National Center for Missing & Exploited Children in the United States, and Meta has just joined. The service enables underage people to get sexually explicit photos of themselves removed from the internet. It works like this:
Anyone under the age of 18 (or anyone concerned about pictures taken of them when they were underage) is welcome to use the new site. Major websites signed up to the service include Facebook, Instagram, OnlyFans, Yubo and Pornhub.

Although the move is a positive step, Julie Inman Grant, Australia's eSafety Commissioner, has asked social media behemoths like Meta to act proactively rather than wait for user reports. As Michael Salter, an associate professor of criminology at the University of New South Wales, puts it: "We are not going to see a reduction in child sexual abuse material online until major service providers are required to proactively scan for and seek the removal of child sexual abuse material."

Take It Down is certainly a first step in giving victims an anonymous and accessible tool against child sexual exploitation.

2. One to be wary of

Voice Clone Bypass

It's AI versus AI.

Journalist Joseph Cox of Vice has highlighted the problems with the "my voice is my password" approach to security. Cox was able to trick Lloyds Bank's voice biometric system into granting him access to his own account by creating a convincing clone of his voice with an inexpensive AI voice-cloning tool.

Banks use voice biometrics to confirm account holders over the phone, and this experiment proved that these systems can be compromised using AI voice-cloning technology. Lloyds Bank states that its "secure" Voice ID feature examines over "100 different aspects of your voice, which like your fingerprint, are unique to you." Even though voice-cloning fraud may not be a common problem right now, you shouldn't rely on voice biometrics as your only line of security. The same goes for facial recognition – sorry everyone!

Differentiating between real and fake voices will only get harder as the technology advances, posing new problems for society. It's critical that people and organizations are aware of these threats and take precautions on both sides.

3. One to amaze

"Replay Memories"

Wist: Immersive Memories is something I have been reading about in my nerdy sci-fi novels for some time now. Do you remember that scene from Minority Report where Tom Cruise is immersed in the memories he has with his deceased wife and kid? This is now a real possibility, with the ability to transform regular videos (and let's face it, our entire lives are on video now) into an immersive VR memory experience. I really am torn between this being creepy and weird or absolutely awesome.

https://www.youtube.com/watch?v=Ey6s6-yofJU

Wist takes a video of your favourite memory and then uses AR (augmented reality) and VR (virtual reality) to "replay" that video in real life. It turns your 2D video into a 3D scene that you can relive any time you wish, and it even lets you invite friends to relive the moment together. Of course, to do this you'll need a VR headset, and for now the app is invite-only. The videos also tend to break apart and lose their edges a little, but as this tech improves it will no doubt become near indistinguishable from reality.

This may just become the next-generation slideshow. "Hey, come and sit through a full-length revisiting of my exciting holiday to Hawaii." Or for those friends you're a bit closer to (I'm loath to even say it after the first topic in this newsletter), I'm sure the adult movie industry is paying close attention.