ACSOL’s Conference Calls

Conference Call Recordings Online
Dial-in number: 1-712-770-8055, Conference Code: 983459


Monthly Meetings | Recordings (10/16 Recording Uploaded)
Emotional Support Group Meetings

We have emailed a link to the conference videos to all attendees and those who purchased the videos. If you haven’t received it and it is not in your spam folder, email

conference at all4consolaws dot org

 

General News

Apple to scan U.S. iPhones for images of child sexual abuse

Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool designed to detect known images of child sexual abuse, called “neuralMatch,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.

The detection system will only flag images that are already in the center’s database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn’t “see” such images, just mathematical “fingerprints” that represent them — could be put to more nefarious purposes.
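The “fingerprint” matching described above can be sketched in a few lines. This is a hypothetical illustration only: real systems such as PhotoDNA or NeuralHash use perceptual hashes that tolerate resizing and re-encoding, whereas this sketch substitutes an ordinary cryptographic hash just to show the match-against-a-database workflow; all hashes and byte strings here are made up.

```python
import hashlib

# Hypothetical database of known-bad fingerprints (illustrative stand-ins).
known_fingerprints = {
    hashlib.sha256(b"known-image-bytes-1").hexdigest(),
    hashlib.sha256(b"known-image-bytes-2").hexdigest(),
}

def flag_if_match(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint is in the known database.

    Note: only the fingerprint is compared; the matcher never "sees"
    the image itself, just the hash derived from it.
    """
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in known_fingerprints

flag_if_match(b"known-image-bytes-1")   # matches the database -> True
flag_if_match(b"vacation-photo-bytes")  # no match -> False
```

The design point the article raises follows directly from this structure: the matcher flags whatever is in the fingerprint set, so whoever controls the set controls what gets flagged.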

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those to scan user files stored in its iCloud service, which is not as securely encrypted as its on-device data, for child pornography.

Apple has been under government pressure for years to allow for increased surveillance of encrypted data. Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children while keeping its high-profile commitment to protecting the privacy of its users.

But a dejected Electronic Frontier Foundation, the online civil liberties pioneer, called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

Meanwhile, the computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of battling child sexual abuse.

“Is it possible? Of course. But is it something that I’m concerned about? No,” said Hany Farid, a researcher at the University of California at Berkeley, who argues that plenty of other programs designed to secure devices from various threats haven’t seen “this type of mission creep.” For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but also employs a system for detecting malware and warning users not to click on harmful links.

Apple was one of the first major companies to embrace “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressured the company for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have lifesaving potential for children.”

Julia Cordua, the CEO of Thorn, said that Apple’s technology balances “the need for privacy with digital safety for children.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.

But in a blistering critique, the Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon the changes, which it said effectively destroy the company’s guarantee of “end-to-end encryption.” Scanning of messages for sexually explicit content on phones or computers effectively breaks the security, it said.

The organization also questioned Apple’s technology for differentiating between dangerous content and something as tame as art or a meme. Such technologies are notoriously error-prone, CDT said in an emailed statement. Apple denies that the changes amount to a backdoor that degrades its encryption. It says they are carefully considered innovations that do not disturb user privacy but rather strongly protect it.

Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children’s phones and can also warn the parents of younger children via text message. It also said that its software would “intervene” when users try to search for topics related to child sexual abuse.

In order to receive the warnings about sexually explicit images on their children’s devices, parents will have to enroll their child’s phone. Kids over 13 can unenroll, meaning parents of teenagers won’t get notifications.

Apple said neither feature would compromise the security of private communications or notify police.

Source: abcnews

We welcome a lively discussion with all viewpoints - keeping in mind...  
  1. Your submission will be reviewed by one of our volunteer moderators. Moderating decisions may be subjective.
  2. Please keep the tone of your comment civil and courteous. This is a public forum.
  3. Swear words should be starred out such as f*k and s*t
  4. Please stay on topic - both in terms of the organization in general and this post in particular.
  5. Please refrain from general political statements in (dis)favor of one of the major parties or their representatives.
  6. Please take personal conversations off this forum.
  7. We will not publish any comments advocating for violent or any illegal action.
  8. We cannot connect participants privately - feel free to leave your contact info here. You may want to create a new / free, readily available email address.
  9. Please refrain from copying and pasting repetitive and lengthy amounts of text.
  10. Please do not post in all Caps.
  11. If you wish to link to a serious and relevant media article, legitimate advocacy group or other pertinent web site / document, please provide the full link. No abbreviated / obfuscated links.
  12. We suggest composing lengthy comments in a desktop text editor and copying and pasting them into the comment form
  13. We will not publish any posts containing any names not mentioned in the original article.
  14. Please choose a short user name that does not contain links to other web sites or identify real people
  15. Please do not solicit funds
  16. If you use any abbreviation such as Failure To Register (FTR), or any others, the first time you use it please expand it for new people to better understand.
  17. All commenters are required to provide a real email address where we can contact them.  It will not be displayed on the site.
  18. Please send any input regarding moderation or other website issues via email to moderator [at] all4consolaws [dot] org
ACSOL, including but not limited to its board members and agents, does not provide legal advice on this website.  In addition, ACSOL warns that those who provide comments on this website may or may not be legal professionals on whose advice one can reasonably rely.  
 

This is the slippery slope that technology has created. Under the guise of “protection” the public has given companies carte blanche to delve into our personal lives as authorities see fit in order to manipulate and control the flow of information. And who’s to say the information isn’t input in order to get the results wanted? “Hey, this person already did this before, so let’s make them look guilty again. No one will challenge it, since that person has already been shown as ‘dangerous’ “.
They already have the means to follow you literally every second of your life. What you eat, watch, purchase, even your driving habits, are all recorded and categorized as safe or dissenting.
The only way to avoid this is to go ‘low tech’. This is one reason electronic payments are being promoted by companies and governments. Cash can’t be traced as easily. Older cars are being removed as quickly as possible in order to get new gps installed vehicles on the road. Has less to do with environment and more to do with oversight.
We have entered the age of conformity. Individual thought and expression has given way to mass manipulation and uniformity.
People will say, “ That just sounds like another conspiracy theory”. Well conformity isn’t a theory, it goes back to the beginning of civilization. Something as mundane as hairstyles, which were used to identify friend from enemy, was one of the first means of manipulation and control.
Now it is more nuanced. If you wear brighter clothes, your car is a different model, or your cellular phone isn’t the popular brand. I challenge you to let your kid wear a Fidel Castro t-shirt to school, or to send a text with the words ‘bomb’ or ‘terrorism’ in it. Guarantee somewhere there is an electronic file being made to note that, for the public’s protection.

Are those the Fusion Centers that 1st Amendment auditors talk about not wanting their information in?

Doesn’t sound like this technology stops new sexual abuse from happening but simply sweeps up people unlucky enough to be recipients of the marked images. Only idiots would believe this helps save children from sexual abuse.

One more reason to use Signal as your default messaging app.

The problem with Signal is that messages, encrypted or not, are still passed through Signal’s servers. Connections between devices and people can still be ascertained with the right amount of access.


Look for decentralized blockchain platforms that provide true encryption that even the companies themselves cannot access.

For file storage, one such company is called Internxt.

People should have freedom and privacy.

This pathetic attempt at funneling more money into law enforcement ever-expanding annual budgets does NOTHING to stop actual abuse. This is all simply for government jobs expansion and job security. Nothing more.

If they can read what’s on our phones then they can also write onto our phones. This simply opens the door for planting fake evidence down the road. It will happen. Probably already does. This needs to be explained to people. Ask your local programmer: once you have access to the phone like that, you can do pretty much anything to it. I maybe trust Apple, but I do NOT trust governments. If Apple has this, then you know the FBI and NSA and foreign agencies have it or will have it very soon. Time to get a flip phone.

“This simply opens the door for planting fake evidence down the road. It will happen. Probably already does.”

Indeed, that ship has already sailed, amigo.

https://www.gizbot.com/mobile/news/pegasus-spyware-spotted-on-iphone-with-zero-click-imessage-exploit-everything-you-need-to-know-075518.html

You can’t just plant an image on someone’s phone without risking serious prison time.

Distribution of child pornography is a Federal crime carrying a mandatory minimum sentence. You think the feds will just let you get away with trying to frame someone? You’d do more time than your target.

Prosecution of a person, planting images on your phone, is not assured. 19 years ago I was involved in a bitter divorce; and was charged with allegedly having child porn. No porn was found on any of my confiscated devices; only on the one PC my ex-wife “turned in.” After I was found not guilty of that charge, she was not prosecuted because the DA said “it was not in the public interest,” because it would discourage legitimate complaints. My only consolation is the knowledge that, for the last 19 years, she has been spending 8 to 12 hours a day of her life, working as an LPN (Low Paid Nurse) in the hospice wing of a federal corrections facility in NM.

Ironically, Apple will be “requesting” copies of the suspected images/files. Does the person who has possession suddenly also get charged with distribution? Does Apple get charged with possession?

@N.M.

Does the FBI get this treatment on the dark web when they do the same or is Apple acting as an agent for favorable treatment in exchange for the suspicious visual data?

Speaking as an InfoSec engineer, what Apple is doing is no different than what Dropbox already does. What makes Apple’s approach “different” is that they are performing “edge computing,” in which the image hashing and comparisons are being done on that expensive phone you paid for. Plus, they are using your data plan to download updated CSAM hash lists to your phone. I think one could create a legal argument that Apple is using your phone and data plan without adequate compensation.
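The “edge computing” split described in this comment can be sketched as follows. This is a hypothetical illustration, not Apple’s actual implementation: the `Device` class, hash values, and result strings are all made up, and only the *location* of the comparison (on the phone, against a downloaded hash list, before upload) reflects what the comment describes.

```python
from dataclasses import dataclass

@dataclass
class Device:
    """Hypothetical phone holding a hash list downloaded over the data plan."""
    hash_list: set[str]

    def scan_before_upload(self, image_hash: str) -> str:
        # The comparison runs here, on the device, not on a server;
        # only the match/no-match outcome affects what leaves the phone.
        return "report-match" if image_hash in self.hash_list else "upload-normally"

phone = Device(hash_list={"aa11", "bb22"})  # illustrative hashes
phone.scan_before_upload("aa11")  # in the downloaded list -> "report-match"
phone.scan_before_upload("cc33")  # not in the list -> "upload-normally"
```

The contrast with a service like Dropbox is simply where `scan_before_upload` runs: server-side scanning spends the provider’s compute, while edge computing spends the phone owner’s battery, storage, and data plan, which is the basis of the compensation argument above.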

Cybersecurity researcher here. Lots of angles to this. Stepping back from all the common arguments for or against this I have to ask myself, what’s the REAL reason this is suddenly being done? This has nothing to do with finding and reporting CP. My guess at this point is to open a path for mass deletion of content they wish to be able to delete.

Mass deletion of child pornography would be a good thing, no? I mean, there is so much of this content out there, and so far few solutions to it.

@N.M.

If deleting is the goal, then they are possibly being manipulated by the USG to delete anything that goes against a narrative, preventing the spread of content that certain people find objectionable. Sounds like censorship to me.

It’s not just the “cloud” they will target. The algorithms will also scan “client side” (your actual physical phone HD) for untoward content. This is just a camel nose for something even more insidious down the road.

This is not a “camel nose”, this is the whole camel being inside your phone.

Personally, I’m not interested in owning any devices or using any services that spy on me for big government. This is just a great reminder, just like with the Oppression Lists (OLs), that if you are going to be doing anything that you don’t want others to know about, you really need to understand how to do it properly.

This intrusion, just like the Law Enforcement Criminals’ stings, likely won’t catch anyone of significance. It will catch some people who don’t know what they are doing or people just accidentally getting into the wrong places. The real predators will be better. They already know all about internet-connected devices, metadata, encryption, and all that. And of course, IF they are listed on the OLs, IF, that is only encouraging them to commit crimes.

So much for authorizing the invasion of privacy, just because of cp.

What about the 4th Amendment? Seems to me that a warrant would be required in order to do this. What happens when they don’t find what they’re looking for, but they find something else instead? I doubt it would be admissible as evidence in court. There are other privacy issues as well. And we already know how hackers work. Most people keep sensitive information on their phones like banking apps and saved passwords.
It’s one thing when the government reaches out like this, but when a private corporation does it, it’s even worse.

@Disgusted

I believe this falls under Apple’s TOS page and what is acceptable and open to their ability as a private company to do. I’d even venture this is akin to Best Buy reporting the same when someone brings a computer in for servicing when they find matter they feel needs to be reported to LE for investigating.

I don’t know. If you bought the phone and own it outright, seems like it’s your property and they have no right to search what you have on it. A person could have easily downloaded illegal images using public wifi, so technically that person was not using Apple’s service at that time. I don’t condone CP, but people still have constitutional rights against unlawful search and seizure.
I don’t see a similarity when giving your laptop to some repairman at BestBuy for repairs, because you physically handed over the property. If I’m walking down the sidewalk and not doing anything suspicious, but have counterfeit money in my pocket when a cop asks me what I’m doing, he has no right to search me without reasonable and articulable suspicion that I am engaging in criminal activity.
I’m all about Constitutional rights.

Apple is just doing this to keep the federal government off their back. The feds are still mad about those two terrorists in San Bernardino County who shot up that regional medical center, and they’re just using their KEEPING CHILDREN SAFE angle to subpoena/bully Apple into opening their phones during any federal investigations. It works every time.
Apple’s not stupid. Nobody wants problems with the U.S. federal government; they’re one of the most powerful organizations in the world.

Good luck

I agree with @AERO1 on this point about keeping the Feds off their back through a closed door session or two. I don’t agree overall with what Tim Cook and Apple are doing in a policing action, but as a private company, they can have their standards, which should be under the TOS page for all to read and acknowledge with a box. Never have been an Apple fan, even during the heydays of green monochromatic CRT screens, and this will most assuredly increase the size of the anti-Apple crowd.

While the experts may not have seen other “mission creep” in this area of technology, they should not assume it has not happened either. FISA court releases showed a lot more well after missions were completed, so mission creep was not known until much later (which is usual). Experts are not always so smart, e.g. Dr. Hansen.

If I heard right on the national news, they were saying that the photos would be scanned when they were uploaded from your iPhone to iCloud. With that, I think they’re saying that iCloud is the property of Apple and not the owner of the phone.

Only in the US? And again, FBI-identified photos… so just more after-the-fact response. If they can do this after, why not stop it before? Why can’t they put a filter into apps/phones/programs so you can’t get those images in the first place? This does nothing to stop the flow if people really want it. It does absolutely nothing to stop child abuse.

If this story is even true, it’s encouraging that some people at Apple are a little privacy conscious. It’s not hard to imagine the government or vindictive spouse planting incriminating evidence in someone’s iCloud account.

https://www.the-sun.com/news/3464672/apple-revolt-tech-scan-customers-iphones-child-abuse-privacy/

Today’s podcast of “The Daily”, by The New York Times was about Apple and their effort to crack down on child abuse images. Look for the podcast called “The Daily.”
