
How one California man scammed his way into thousands of Apple iCloud accounts to steal nude images


A 40-year-old man is accused of hacking into users' iCloud accounts and stealing an estimated 620,000 photos to search for nude images of women, after pretending to be Apple tech support in emails. 

Hao Kuo Chi, 40, of La Puente, California, pleaded guilty to one count of conspiracy and three counts of gaining unauthorized access to a protected computer, according to court records. 

He obtained an estimated 620,000 photos from at least 306 victims, the majority of them young women, the FBI estimated. He found about 200 of his victims online after marketing himself as someone who could hack into iCloud accounts, using the username 'iclouddripper4you.' 

The FBI said it found two Gmail addresses attached to the usernames Chi used during his scam – 'applebackupicloud' and 'backupagenticloud' – containing more than 500,000 emails and 4,700 iCloud IDs and passwords.

Hao Kuo Chi, 40, of La Puente, California, hacked into 306 Apple users’ iCloud accounts in search of nude images and videos of young women

‘Customers’ would request he hack into certain iCloud accounts and he responded with a Dropbox link, according to a court statement by FBI agent Anthony Bossone. 

His Dropbox account contained roughly 620,000 images and 9,000 videos taken from his victims. 

The content was organized, in part, according to whether it contained nude images.  

He communicated with his unnamed co-conspirators via foreign encrypted emails and considered finding a nude image ‘a win’, authorities said. They collected and shared the nude images and videos with one another. 

The FBI estimated that Chi had more than 620,000 images and 9,000 videos from his victims, and that he and his unnamed co-conspirators considered finding nude images a 'win.' Chi pleaded guilty to one count of conspiracy and three counts of gaining unauthorized access to a protected computer

Chi admitted he didn't even 'know who was involved,' he told the Los Angeles Times.

The scam started to fall apart in March 2018, when a popular California company known for removing celebrity images from the internet notified an unnamed public figure in Tampa, Florida, that their nude images had appeared on pornographic sites. 

The images were stored on their iPhone and had been uploaded to iCloud. 

Bossone reported that investigators later discovered the victim's iCloud account had been accessed from Chi's home.

The FBI got a search warrant and raided Chi’s home on May 19. 

Before this, investigators found Chi’s Dropbox, Apple, Google and Facebook accounts, and had a record of his online activities.   

How to avoid scams as an Apple user

With the increase in data breaches and the need to protect one’s privacy, identity, and information, users cannot solely rely on Apple to protect them. 

Here’s how to protect yourself from scams:

  • Never share credit card or personal information unless the recipient is verified 
  • Use two-factor authentication
  • Never share Apple ID information or verification codes 
  • Be careful of suspicious emails, phone calls, or text messages 
  • Don’t interact with pop-up boxes offering free prizes or prompting you to download software  

 Source: Apple 

On August 5, Chi pleaded guilty to four charges and faces up to five years in prison for each charge. 

In a similar 2007 incident, several Geek Squad members from around the country admitted they saw co-workers saving users' personal photos and videos onto DVDs. 

Former Geek Squad member Brett Haddock told the Baltimore Sun at the time, ‘Any attractive young woman who drops off her computer with the Geek Squad should assume that her photos will be looked at.’  

Apple has faced backlash for years over the ongoing privacy and security issues with user information. 

Recently, the Israeli firm NSO Group delivered malware directly to users' phones via text, bypassing Apple's security features. 

Pegasus, its surveillance tool, was able to collect emails, call records, social media posts, user passwords, contact lists, pictures, videos and more from 23 users. 

It also could activate microphones and cameras, and collect fresh data, including location, without the user interacting or knowing it was on their phone. 

More than 50,000 phone numbers from more than 50 countries had been collected, according to the Washington Post.

This damaged Apple’s reputation of being secure and safe for its users.  

Detecting child pornography and app tracking: What Apple is doing to protect its users’ information and privacy 

Over the years, Apple has been actively adding more security features to its products to protect its users’ identities and information.  

Apple is planning to implement software that scans MacBooks, iPhones, and iPads for child pornography in its upcoming iOS15 and macOS Monterey updates. 

It is implementing a system to detect child sexual abuse material (CSAM). 

This detection system will operate as surveillance and provide law enforcement with valuable information about child sexual abuse. 

It will scan iMessages and use on-device machine learning and pop-up warnings to alert users to sensitive content, while keeping communications private from Apple.

In Apple's upcoming iOS15 update, it will send notifications to children who receive sexually explicit photos in the Messages app and will alert parents if the child chooses to view them. It only works on a family account

When a child receives or sends sexually explicit content, the update will blur the image and give the user viewing options. If a child chooses to view or send a sexually explicit image, they will be warned beforehand that their parent will receive a message saying they have viewed the image.

This feature can only be used for those set up on a family account.  

It will also scan photos and intervene if users are searching for CSAM images. 

Apple will be able to report sexually explicit photos it detects to the National Center for Missing and Exploited Children (NCMEC). 

The company will also alert users if they search for CSAM-related material. These new features will allow Apple to notify NCMEC to help local authorities

In the earlier iOS14.5 update, Apple users began receiving pop-up notifications about app tracking. 

App Tracking Transparency now requires app developers to ask users if they would like their information to be tracked in new pop-ups.

Users can opt in to this by going to the tracking option in Privacy settings and making sure ‘Allow Apps to Request to Track’ is turned on.    

Source: Apple, Business Insider 
