> Google’s hash match may well have established probable cause for a warrant to allow police to conduct a visual examination of the Maher file.
Very reasonable. Google can flag accounts as CP, but then a judge still needs to issue a warrant for the police to actually go and look at the file. Good job court. Extra points for reasoning about hash values.
> a judge still needs to issue a warrant for the police to actually go and look at the file
Only in the future. Maher's conviction, based on the warrantless search, still stands because the court found that the "good faith exception" applies--the court affirmed the District Court's finding that the police officers who conducted the warrantless search had a good faith belief that no warrant was required for the search.
I wonder what happened to fruit of the poisonous tree? That doctrine seems a lot more liberty-oriented than a "good faith exception" for police who don't think they need a warrant (because police never seem to "think" they need a warrant).
I'm trying to imagine a more "real-world" example of this to see how I feel about it. I dislike that there is yet another loophole to gain access to peoples' data for legal reasons, but this does feel like a reasonable approach and a valid goal to pursue.
I guess it's like if someone noticed you had a case shaped exactly like a machine gun, told the police, and they went to check if it was registered or not? I suppose that seems perfectly reasonable, but I'm happy to hear counter-arguments.
The main factual components are as follows: Party A has rented out property to Party B. Party A performs surveillance on or around the property with Party B's knowledge and consent. Party A discovers very high probability evidence that Party B is committing crimes within the property, and then informs the police of their findings. Police obtain a warrant, using Party A's statements as evidence.
The closest "real world" analogy that comes to mind might be a real estate management company using security cameras or some other method to determine that there is a crime occurring in a space that they are renting out to another party. The real estate management company then sends evidence to the police.
In the case of real property -- rental housing and warehouse/storage space in particular -- this happens all the time. I think that this ruling is eminently reasonable as a piece of case law (ie, the judge got the law as it exists correct). I also think this precedent would strike a healthy policy balance (ie, the law as it exists, interpreted how the judge in this case interprets it, would produce a good policy outcome).
I think the real-world analogy would be to say that the case is shaped exactly like a machine gun and the hotel calls the police, who then open the case without a warrant. The "private search" doctrine allows the police to repeat a search done by a private party, but here (as in the machine gun case), the case was not actually searched by a private party.
But this court decision is a real world example, and not some esoteric edge case.
This is something I don’t think needs analogies to understand. SA/CP image and video distribution is an ongoing moderation, network, and storage issue. The right to not be under constant digital surveillance is somewhat protected in the constitution.
I like speech and privacy and am paranoid of corporate or government overreach, but I arrive at the same conclusion as you taking this court decision at face value.
Don't they? If you tell the cops that your neighbor has drugs of significant quantity in their house, would they not still need a warrant to actually go into your neighbor's house?
There are a lot of nuances to these situations of third-party involvement and the ruling discusses these at length. If you’re interested in the precise limits of the 4th amendment you should really just read the linked document.
they should as a matter of course. but I guess "papers" you entrust to someone else are a gray area. I personally think that it goes against the separation of police state and democracy, but I'm a nobody, so it doesn't matter I suppose.
Is it reasonable? Even if the hash were MD5, then given valid image files, the chances of an accidental collision are way lower than the chance that any other evidence given to a judge was false or misinterpreted.
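For a cryptographic hash, that intuition can be made concrete with the birthday bound. A rough back-of-the-envelope sketch (my own numbers, not from the decision):

```python
import math

def collision_probability(bits: int, n_items: int) -> float:
    """Birthday-bound approximation: P ~ 1 - exp(-n^2 / 2^(bits+1))."""
    return 1.0 - math.exp(-(n_items ** 2) / (2.0 ** (bits + 1)))

# Even with a trillion distinct files, the chance of ANY accidental
# collision anywhere for a 128-bit hash like MD5 is vanishingly small
# (roughly 1.5e-15) -- assuming no one is deliberately crafting collisions.
print(collision_probability(128, 10 ** 12))
```

Deliberate collisions are a different story (MD5 is broken in that sense), but an accidental match against a curated database is effectively impossible for a cryptographic hash. None of this applies to perceptual hashes, as other commenters note.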
This is NOT a secure hash. It is an image-similarity (perceptual) hash, which has many, many matches among unrelated images.
Unfortunately the decision didn't mention this at all, even though it is important. If it were even as good as an MD5 hash (which is broken), I think the search should be allowed without a warrant, because even though an accidental collision is possible, the odds are so strongly against it that the courts can safely assume there isn't one (and of course, if there were, the police would close the case). However, since this hash is not that good, the police cannot look at the image unless Google does.
It seems like a large part of the ruling hinges on the fact that Google matched the image hash to a hash of a known child pornography image, but didn't require an employee to actually look at that image before reporting it to the police. If they had visually confirmed it was the image they suspected it was based on the hash then no warrant would have been required, but the judge reads that the image hash match is not equivalent to a visual confirmation of the image. Maybe there's some slight doubt in whether or not the image could be a hash collision, which depends on the hash method. It may be incredibly unlikely (near impossible?) for any hash collision depending on the specific hash strategy.
I think it would obviously be less than ideal for Google to require an employee visually inspect child pornography identified by image hash before informing a legal authority like the police. So it seems more likely that the remedy to this situation would be for the police to obtain a warrant after getting the tip but before requesting the raw data from Google.
Would the image hash match qualify as probable cause enough for a warrant? On page 4 the judge stops short of setting precedent on whether it would have or not. It seems likely to me that it would be solid probable cause, but sometimes judges or courts have a unique interpretation of technology that I don't always share, and leaving it open to individual interpretation can lead to conflicting results.
The hashes involved in stuff like this, as with copyright auto-matching, are perceptual hashes (https://en.wikipedia.org/wiki/Perceptual_hashing), not cryptographic hashes. False matches are common enough that perceptual-hashing attacks are already in use to manipulate search engine results (see the example in a random paper on the subject: https://gangw.cs.illinois.edu/PHashing.pdf).
It seems like that is very relevant information that was not considered by the court. If this were a cryptographic hash, I would say with high confidence that this is the same image, and so Google effectively examined it - there is a small chance that some unrelated file (which might not even be a picture) matches, but odds are the universe will end before that happens, and so the courts can consider it the same image for search purposes. However, because there are many false-positive cases, there are reasonable odds that the image is legal, and so a higher standard for search is needed - a warrant.
That makes sense - if they were using a cryptographic hash then people could get around it by making tiny changes to the file. I've used some reverse image search tools, which use perceptual hashing under the hood, to find the original source for art that gets shared without attribution (SauceNAO is pretty solid). They're good, but they definitely have false positives.
Now you’ve got me interested in what’s going on under the hood, lol. It’s probably like any other statistical model: you can decrease your false negatives (images people have cropped or added watermarks/text to), but at the cost of increased false positives.
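That tradeoff is easy to see with a toy model. Unrelated images behave roughly like independent random 64-bit perceptual hashes, so their expected Hamming distance is about 32 bits; loosening the match threshold catches more edited copies (fewer false negatives) but lets more unrelated pairs through. A quick simulation (my own sketch, not how any real system is tuned):

```python
import random

random.seed(0)

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Model unrelated images as pairs of independent random 64-bit hashes
# and count how often each match threshold fires on them.
pairs = [(random.getrandbits(64), random.getrandbits(64))
         for _ in range(10_000)]
for threshold in (4, 10, 16, 24):
    fp = sum(1 for a, b in pairs if hamming(a, b) <= threshold)
    print(f"distance <= {threshold}: {fp} false matches per 10,000 pairs")
```

A tight threshold almost never fires on random pairs, while a loose one starts matching a visible fraction of them, which is exactly the false-positive behavior being discussed.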
This submission is the first I've heard of the concept. Are there OSS implementations available? Could I use this, say, to deduplicate resized or re-jpg-compressed images?
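Yes - there are OSS implementations (e.g. the Python `imagehash` package), and deduplicating resized or re-compressed images is a classic use case. The simplest variant, "average hash," fits in a few lines; here is a dependency-free toy version (my own sketch, operating on a 2D list of grayscale values instead of a real image file):

```python
def average_hash(pixels, hash_size=8):
    """Toy 'aHash': downscale by block averaging, then threshold each
    cell against the overall mean to get a 64-bit fingerprint."""
    bh = len(pixels) // hash_size
    bw = len(pixels[0]) // hash_size
    small = [
        [sum(pixels[y][x]
             for y in range(r * bh, (r + 1) * bh)
             for x in range(c * bw, (c + 1) * bw)) / (bh * bw)
         for c in range(hash_size)]
        for r in range(hash_size)
    ]
    mean = sum(sum(row) for row in small) / hash_size ** 2
    bits = 0
    for row in small:
        for v in row:
            bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a, b):
    return bin(a ^ b).count("1")

# A 16x16 grayscale gradient and a uniformly brightened copy hash
# identically, because thresholding against the mean cancels the shift.
img = [[x * 8 + y for x in range(16)] for y in range(16)]
brighter = [[v + 10 for v in row] for row in img]
print(hamming(average_hash(img), average_hash(brighter)))  # 0
```

For dedup you would hash every image and treat pairs within a small Hamming distance as duplicates. Real libraries use more robust variants (pHash uses a DCT), which survive resizing and re-JPEG-compression much better than this toy.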
The hash functions used for these purposes are usually not cryptographic hashes. They are "perceptual hashes" that allow for approximate matches (e.g. if the image has been scaled or brightness-adjusted). https://en.wikipedia.org/wiki/Perceptual_hashing
> Maybe there's some slight doubt in whether or not the image could be a hash collision, which depends on the hash method. It may be incredibly unlikely (near impossible?) for any hash collision depending on the specific hash strategy.
If it was a cryptographic hash (apparently not), this mathematical near-certainty is necessary but not sufficient. Like cryptography used for confidentiality or integrity, the math doesn't at all guarantee the outcome; the implementation is the most important factor.
Each entry in the illegal hash database, for example, relies on some person characterizing the original image as illegal - there is no mathematical formula for defining illegal images - and that characterization could be inaccurate. It also relies on the database's integrity, the user's application and its implementation, even the hash calculator. People on HN can imagine lots of things that could go wrong.
If I were a judge, I'd just want to know if someone witnessed CP or not. It might be unpleasant but we're talking about arresting someone for CP, which even sans conviction can be highly traumatic (including time in jail, waiting for bail or trial, as a ~child molestor) and destroy people's lives and reputations. Do you fancy appearing at a bail hearing about your CP charge, even if you are innocent? 'Kids, I have something to tell you ...'; 'Boss, I can't work for a couple weeks because ...'.
It seems like there just needs to be case law about the qualifications of an image hash in order to be counted as probable cause for a warrant. Of course you could make an image hash be arbitrarily good or bad.
I am not at all opposed to any of this "get a damn warrant" pushback from judges.
I am also not at all opposed to Google searching its cloud storage for this kind of content. There are a lot of kinds of potentially illegal activity I would mind a cloud provider going on fishing expeditions to find, but this I am fine with.
I do strongly object to companies searching content for illegal activity on devices in my possession absent probable cause and a warrant (that they would have to get in a way other than searching my device). Likewise I object to the pervasive and mostly invisible delivery to the cloud of nearly everything I do on devices I possess.
In other words, I want custody of my stuff and for the physical possession of my stuff to be protected by the 4th amendment and not subject to corporate search either. Things that I willingly give to cloud providers that they have custody of I am fine with the cloud provider doing limited searches and the necessary reporting to authorities. The line is who actually has the bits present on a thing they hold.
If the hashes were made available to the public, I think we should just flood the internet with matching but completely innocuous images so they can no longer be used to justify a search.
>please use the original title, unless it is misleading or linkbait; don't editorialize. (@dang)
On topic, I like this quote from the first page of the opinion:
>A “hash” or “hash value” is “(usually) a short string of characters generated from a much larger string of data (say, an electronic image) using an algorithm—and calculated in a way that makes it highly unlikely another set of data will produce the same value.” United States v. Ackerman, 831 F.3d 1292, 1294 (10th Cir. 2016) (Gorsuch, J.).
It's amusing to me that they use a supreme court case as a reference for what a hash is rather than eg. a textbook. It makes sense when you consider how the court system works but it is amusing nonetheless that the courts have their own body of CS literature.
Maybe someone could publish a "CS for Judges" book that teaches as much CS as possible using only court decisions. That could actually have a real use case when you think of it. (As other commenters pointed out, the hashing definition given here could use a bit more qualification, and should at least differentiate between neural hashes and traditional ones like MD5, especially as it relates to the likeliness that "another set of data will produce the same value." Perhaps that could be an author's note in my "CS for Judges" book.)
> Maybe someone could publish a "CS for Judges" book
At last, a form of civic participation which seems both helpful and exciting to me.
That said, I am worried that a lot of necessary content may not be easy to introduce with hard precedent, and direct advice or dicta might somehow (?) not be permitted in a case since it's not adversarial... A new career as a professional expert witness--even on computer topics--sounds rather dreary.
What's so weird about this? CS literature is not legally binding in any way. Of course a judge would rather quote a previous ruling by fellow judge than a textbook, Wikipedia, or similar sources.
From what I understand, a judge is free to decide matters of fact on his own, which could include relying on a textbook. Also, it is not clear that matters of fact decided by the Supreme Court are binding on lower courts. Additionally, facts and even the meanings of words themselves can change, which makes previous findings of fact no longer applicable. That's actually true in this case as well. "Hash" as used in the context of images generally meant something like an MD5 hash (which itself is now more prone to collisions than before). The "hash" in the Google case appears to be a perceptual hash, which I don't think was as commonly used until recently (I could be wrong here). So whatever findings of fact were made by the Supreme Court about how reliable a hash is are not necessarily relevant to begin with. Looking at this specific case, here is the full quote from United States v. Ackerman:
>How does AOL's screening system work? It relies on hash value matching. A hash value is (usually) a short string of characters generated from a much larger string of data (say, an electronic image) using an algorithm—and calculated in a way that makes it highly unlikely another set of data will produce the same value. Some consider a hash value as a sort of digital fingerprint. See Richard P. Salgado, Fourth Amendment Search and the Power of the Hash, 119 Harv. L. Rev. F. 38, 38-40 (2005). AOL's automated filter works by identifying the hash values of images attached to emails sent through its mail servers.[0]
I don't have access to this issue of Harvard Law Review but looking at the first page, it says:
>Hash algorithms are used to confirm that when a copy of data is made, the original is unaltered and the copy is identical, bit-for-bit.[1]
This is clearly referring to a cryptographic hash like MD5, not a perceptual hash/neural hash as in Google. So the actual source here is not necessarily dealing with the same matters of fact as the source of the quote here (although there could be valid comparisons between them).
All this said, judges feel more confident in citing a Supreme Court case than a textbook because 1. it is easier to understand for them 2. the matter of fact is then already tied to a legal matter, instead of the judge having to make that leap himself and also 3. judges are more likely to read relevant case law to begin with since they will read it to find precedent in matters of law – which are binding to lower courts. This is why a "CS for Judges" could be a useful reference book.
Lastly, I should have looked a bit more closely at the quoted case. This is actually not a supreme court case at all. Gorsuch was nominated in 2017 and this case is from 2016.
> As the district court correctly ruled in the alternative, the good faith exception to the exclusionary rule supports denial of Maher’s suppression motion because, at the time authorities opened his uploaded file, they had a good faith basis to believe that no warrant was required
So this means this conviction is upheld but future convictions may be overturned if they similarly don't acquire a warrant?
> the good faith exception to the exclusionary rule supports denial of Maher’s suppression motion because, at the time authorities opened his uploaded file, they had a good faith basis to believe that no warrant was required
This "good faith exception" is so absurd I struggle to believe that it's real.
Ordinary citizens are expected to understand and scrupulously abide by all of the law, but it's enough for law enforcement to believe that what they're doing is legal even if it isn't?
What that is is a punch line from a Chapelle bit[1], not a reasonable part of the justice system.
The courts accept good-faith arguments at times. They will give reduced sentences, or even none at all, if they think you acted in good faith. There are enough situations where it is legal to kill someone that there are laws spelling out when one person can legally kill another (hopefully they never apply to you).
Note that this case is not about ignorance of the law. This is "I knew the law and was trying to follow it - I just honestly thought it didn't apply because of some tricky situation that isn't 100% clear."
“Mens rea” is a key component of most crimes. Some crimes can only be committed if the perpetrator knows they are doing something wrong. For example, fraud or libel.
At the time, what they did was assumed to be legal because no one had ruled on it.
Now, there is prior case law declaring it illegal.
The ruling is made in such a way to say “we were allowing this, but we shouldn’t have been, so we wont allow it going forward”.
I am not a legal scholar, but that’s the best way I can explain it. The way that the judicial system applies to law is incredibly complex and inconsistent.
This is a deeply problematic way to operate. En masse, it has the right result, but, for the individual that will have their life turned upside down, the negative impact is effectively catastrophic.
This ends up feeling a lot like gambling in a casino. The casino can afford to bet and lose much more than the individual.
This specific conviction upheld, yes. But no, this ruling doesn't speak to whether or not any future convictions may be overturned.
It simply means that at the trial court level, future prosecutions will not be able to rely on the good faith exception to the exclusionary rule if warrantless inculpatory evidence is obtained under similar circumstances. If the government were to try to present such evidence at trial and the trial judge were to admit it over the objection of the defendant, then that would present a specific ground for appeal.
This ruling merely bolsters the 'better to get a warrant' spirit of the Fourth Amendment.
It's crazy that the most dangerous people one regularly encounters can do anything they want as long as they believe they can do it. The good faith exemption has to be one of the most fascist laws on the books today.
> "the good faith exception to the exclusionary rule supports denial of Maher’s suppression motion because, at the time authorities opened his uploaded file, they had a good faith basis to believe that no warrant was required."
In no other context or career can you do anything you want and get away with it just as long as you say you thought you could. You'd think police officers would be held to a higher standard, not no standard.
And specifically with respect to the law, breaking a law and claiming you didn't know you did anything wrong as an individual is not considered a valid defense in our justice system. This same type of standard should apply even more to trained law enforcement, not less, otherwise it becomes a double standard.
No, this is breaking the law while believing that this looked like one of the situations where the law is already known not to apply. If Google had looked at the actual image and said it was child porn, instead of just saying it was similar to some image that is child porn, this would be 100% legal, as the courts have already said. That difference is subtle enough that I can see how someone would get it wrong (and in fact I would expect other courts to rule differently).
That's not what this means. One can ask whether the belief is reasonable, that is justifiable by a reasoning process. The argument for applying the GFE in this case is that the probability of false positives from a perceptual hash match is low enough that it's OK to assume it's legit and open the image to verify that it was indeed child porn. They then used that finding to get warrants to search the guy's gmail account and later his home.
If I'm not a professional and I hurt someone while trying to save their life by doing something stupid, that's understandable ignorance.
If a doctor stops to help someone and hurts them because the doctor did something stupid, that is malpractice and could get them sued and maybe get their license revoked.
Would you hire a programmer who refused to learn how to code and claimed "good faith" every time they screwed things up? Good faith shouldn't cover willful ignorance. A cop is hired to know, understand, and enforce the law. If they can't do that, they should be fired.
It's not exactly the same imo, since Good Samaritan laws are meant to protect someone who is genuinely trying to do what a reasonable person could consider "something positive"
In this case you're correct. But the good faith exemption is far broader than this and applies to even officer's completely personal false beliefs in their authority.
I think the judge chose to relax a lot on this one due to the circumstances. Releasing into society a man found with 4,000 child porn photos on his computer would be a shame.
But yeah, this opens the gates of precedent too wide for tyranny, unfortunately...
> It feels like it incentivizes the police to minimize their understanding of the law so that they can believe they are following it.
That's a bingo. That's exactly what they do, and why so many cops know less about the law than random citizens. A better society would have high standards for the knowledge expected of police officers, including things like requiring 4-year criminal justice or pre-law degree to be eligible to be hired, rather than capping IQ and preferring people who have had prior experience in conducting violent actions.
> It feels like it incentivizes the police to minimize their understanding of the law so that they can believe they are following it.
It is not so lax as that. It's limited to situations where a reasonable person who knows exactly what the law and previous court rulings say might conclude that a certain action is legal. In this case, other Federal Circuit courts have ruled that similar actions are legal.
The good faith exception requires the belief be reasonable. Ignorance of clearly settled law is not reasonable, it should be a situation where the law was unclear, had conflicting interpretations or could otherwise be interpreted the way the police did by a reasonable person.
The problem with the internet nowadays is that a few big players are making up their own law. Very often it conflicts with local laws, but nobody can fight it. For example, someone created some content, but another person uploaded it and got better scores, which got the original poster blocked. Another example: children were playing a violin concert and the audio got removed due to an alleged copyright violation. No possibility to appeal; nobody sane would go to court. It just goes this way...
The harshness of the sentence is not for the act of keeping the photos in itself, but for the individual suffering and social damage caused by the actions he incentivizes when he consumes such content.
Consumption per se does not incentivize it, though; procurement does. It's not unreasonable to causally connect one to the other, but I still think that it needs to be done explicitly. Strict liability for possession in particular is nonsense.
There's also an interesting question wrt simulated (drawn, rendered etc) CSAM, especially now that AI image generators can produce it in bulk. There's no individual suffering nor social damage involved in that at any point, yet it's equally illegal in most jurisdictions, and the penalties aren't any lighter. I've yet to see any sensible arguments in favor of this arrangement - it appears to be purely a "crime against nature" kind of moral panic over the extreme ickiness of the act as opposed to any actual harm caused by it.
Assuming the person is a passive consumer with no messages / money exchanged with anyone, it is very hard to prove social harm or damage. Sentences should be proportional to the crime. Treating possession of cp as the equivalent of literally raping a child just seems absurd to me. IMO, just for the legal protection of the average citizen, simple possession should never warrant jail time.
It's a reasonable argument, but a concerning one because it hinges on a couple of layers of indirection between the person engaging in consuming the content and the person doing the harm / person who is harmed.
That's not outside the purview of US law (especially in the world post-reinterpretation of the Commerce Clause), but it is perhaps worth observing how close to the cliff of "For the good of Society, you must behave optimally, Citizen" such reasoning treads.
For example: AI-generated CP (or hand-drawn illustrations) are viscerally repugnant, but does the same "individual suffering and social damage" reasoning apply to making them illegal? The FBI says yes to both in spite of the fact that we can name no human that was harmed or was unable to give consent in their fabrication (handwaving the source material for the AI, which if one chooses not to handwave it: drop that question on the floor and focus on under what reasoning we make hand-illustrated cartoons illegal to possess that couldn't be applied to pornography in general).
> the individual suffering and social damage caused by the actions that he incentivizes
That's some convoluted way to say he deserves 25 years because he may (or may not) at some point in his life molest a kid.
Personally i think that the idea of convicting a man for his thoughts is borderline crazy.
Users of child pornography need to be arrested, treated, flagged, and receive psychological follow-up all along their lives, but sending them away for 25 years is lazy and dangerous, because when they get out they will be even worse than before and won't have much to lose.
What's the new legal loophole? I believe what's described above is the same as it's been for decades, if not centuries.
Disclosure: I work at Google but not on anything related to this.
These hashes are not collision-resistant.
If it was a cryptographic hash (apparently not), this mathematical near-certainty is necessary but not sufficient. Like cryptography used for confidentiality or integrity, the math doesn't at all guarantee the outcome; the implementation is the most important factor.
Each entry in the illegal hash database, for example, relies on some person characterizing the original image as illegal - there is no mathematical formula for defining illegal images - and that characterization could be inaccurate. It also relies on the database's integrity, the user's application and its implementation, even the hash calculator. People on HN can imagine lots of things that could go wrong.
If I were a judge, I'd just want to know if someone witnessed CP or not. It might be unpleasant but we're talking about arresting someone for CP, which even sans conviction can be highly traumatic (including time in jail, waiting for bail or trial, as a ~child molestor) and destroy people's lives and reputations. Do you fancy appearing at a bail hearing about your CP charge, even if you are innocent? 'Kids, I have something to tell you ...'; 'Boss, I can't work for a couple weeks because ...'.
I am not at all opposed to any of this "get a damn warrant" pushback from judges.
I am also not at all opposed to Google searching it's cloud storage for this kind of content. There are a lot of things I would mind a cloud provider going on fishing expeditions to find potentially illegal activity, but this I am fine with.
I do strongly object to companies searching content for illegal activity on devices in my possession absent probable cause and a warrant (that they would have to get in a way other than searching my device). Likewise I object to the pervasive and mostly invisible delivery to the cloud of nearly everything I do on devices I possess.
In other words, I want custody of my stuff, and for the physical possession of my stuff to be protected by the 4th amendment and not subject to corporate search either. For things I willingly give to cloud providers, which they then have custody of, I am fine with the provider doing limited searches and the necessary reporting to authorities. The line is who actually has the bits present on a thing they hold.
On topic, I like this quote from the first page of the opinion:
>A “hash” or “hash value” is “(usually) a short string of characters generated from a much larger string of data (say, an electronic image) using an algorithm—and calculated in a way that makes it highly unlikely another set of data will produce the same value.” United States v. Ackerman, 831 F.3d 1292, 1294 (10th Cir. 2016) (Gorsuch, J.).
It's amusing to me that they use a supreme court case as a reference for what a hash is rather than, e.g., a textbook. It makes sense when you consider how the court system works, but it is amusing nonetheless that the courts have their own body of CS literature.
Maybe someone could publish a "CS for Judges" book that teaches as much CS as possible using only court decisions. That could actually have a real use case when you think of it. (As other commenters pointed out, the hashing definition given here could use a bit more qualification, and should at least differentiate between neural hashes and traditional ones like MD5, especially as it relates to the likeliness that "another set of data will produce the same value." Perhaps that could be an author's note in my "CS for Judges" book.)
At last, a form of civic participation which seems both helpful and exciting to me.
That said, I am worried that a lot of necessary content may not be easy to introduce with hard precedent, and direct advice or dicta might somehow (?) not be permitted in a case since it's not adversarial... A new career as a professional expert witness--even on computer topics--sounds rather dreary.
1. That's weird and represents an operational error that breaks the rules.
2. That's weird and represents a potential deficiency in how the system or rules have been made.
I don't think anyone is suggesting #1, and #2 is a lot more defensible.
>How does AOL's screening system work? It relies on hash value matching. A hash value is (usually) a short string of characters generated from a much larger string of data (say, an electronic image) using an algorithm—and calculated in a way that makes it highly unlikely another set of data will produce the same value. Some consider a hash value as a sort of digital fingerprint. See Richard P. Salgado, Fourth Amendment Search and the Power of the Hash, 119 Harv. L. Rev. F. 38, 38-40 (2005). AOL's automated filter works by identifying the hash values of images attached to emails sent through its mail servers.[0]
I don't have access to this issue of Harvard Law Review but looking at the first page, it says:
>Hash algorithms are used to confirm that when a copy of data is made, the original is unaltered and the copy is identical, bit-for-bit.[1]
This is clearly referring to a cryptographic hash like MD5, not a perceptual hash/neural hash as in Google. So the actual source here is not necessarily dealing with the same matters of fact as the source of the quote here (although there could be valid comparisons between them).
All this said, judges feel more confident citing a Supreme Court case than a textbook because 1. it is easier for them to understand, 2. the matter of fact is then already tied to a legal matter, instead of the judge having to make that leap himself, and 3. judges are more likely to read relevant case law to begin with, since they will read it to find precedent on matters of law – which are binding on lower courts. This is why a "CS for Judges" could be a useful reference book.
Lastly, I should have looked a bit more closely at the quoted case. This is actually not a supreme court case at all. Gorsuch was nominated in 2017 and this case is from 2016.
[0] https://casetext.com/case/united-states-v-ackerman-12
[1] https://heinonline.org/HOL/LandingPage?handle=hein.journals/...
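The contrast that comment draws can be sketched concretely. Here is a toy average-hash (aHash-style perceptual hash, not Google's actual algorithm) over a made-up 8×8 grayscale "image": a one-pixel nudge leaves the perceptual hash intact while the cryptographic digest changes completely.

```python
import hashlib

def average_hash(pixels):
    # Toy perceptual hash: for a downscaled 8x8 grayscale image,
    # set bit i when pixel i is brighter than the mean brightness.
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

# A made-up 8x8 grayscale "image" and a copy with one pixel nudged,
# standing in for recompression noise or a slight edit.
image = [(10 * i) % 256 for i in range(64)]
tweaked = list(image)
tweaked[3] += 1

# The perceptual hashes match; the cryptographic digests do not.
print(average_hash(image) == average_hash(tweaked))          # True
print(hashlib.sha256(bytes(image)).hexdigest() ==
      hashlib.sha256(bytes(tweaked)).hexdigest())            # False
```

A bit-for-bit guarantee in the Salgado sense only holds for the cryptographic digest; the perceptual hash is a similarity heuristic, which is where the false-positive discussion elsewhere in this thread comes in.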
So this means this conviction is upheld but future convictions may be overturned if they similarly don't acquire a warrant?
This "good faith exception" is so absurd I struggle to believe that it's real.
Ordinary citizens are expected to understand and scrupulously abide by all of the law, but it's enough for law enforcement to believe that what they're doing is legal even if it isn't?
What that is is a punch line from a Chappelle bit[1], not a reasonable part of the justice system.
---
1. https://www.youtube.com/watch?v=0WlmScgbdws
Note that this case is not about ignorance of the law. This is "I knew the law and was trying to follow it; I just honestly thought it didn't apply because of some tricky situation that isn't 100% clear."
Now, there is prior case law declaring it illegal.
The ruling is made in such a way to say “we were allowing this, but we shouldn’t have been, so we wont allow it going forward”.
I am not a legal scholar, but that’s the best way I can explain it. The way that the judicial system applies to law is incredibly complex and inconsistent.
This ends up feeling a lot like gambling in a casino. The casino can afford to bet and lose much more than the individual.
It simply means that at the trial court level, future prosecutions will not be able to rely on the good faith exception to the exclusionary rule if warrantless inculpatory evidence is obtained under similar circumstances. If the government were to try to present such evidence at trial and the trial judge were to admit it over the objection of the defendant, then that would present a specific ground for appeal.
This ruling merely bolsters the 'better to get a warrant' spirit of the Fourth Amendment.
> "the good faith exception to the exclusionary rule supports denial of Maher’s suppression motion because, at the time authorities opened his uploaded file, they had a good faith basis to believe that no warrant was required."
In no other context or career can you do anything you want and get away with it just as long as you say you thought you could. You'd think police officers would be held to a higher standard, not no standard.
Isn't that the motto of VC? Uber, AirBnB, WeWork, etc...
> you do any illegal action you want and get away with it just as long as you say you thought you could.
And as for corporations: that's the point of incorporating. Reducing liability.
If a doctor stops to help someone and hurts them because the doctor did something stupid, that is malpractice and could get them sued and maybe get their license revoked.
Would you hire a programmer who refused to learn how to code and claimed "good faith" every time they screwed things up? Good faith shouldn't cover willful ignorance. A cop is hired to know, understand, and enforce the law. If they can't do that, they should be fired.
The reasonable basis here is "Google said it", and it was true.
If the police arrive at a house on a domestic abuse call, and hears screams for help, is breaking down the door done in good faith?
Many white collar crimes, financial and securities fraud/violations can be thwarted this way
Basically, ignorance of the law is no excuse except when you specifically write the law to say it is an excuse
Something that contributes to the DOJ not really trying to bring convictions against individuals at bigger financial institutions
And yeah, a lot of people make sure to write their industry’s laws that way
But yeah, this opens the gates of precedent too wide for tyranny, unfortunately...
It feels like it incentivizes the police to minimize their understanding of the law so that they can believe they are following it.
That's a bingo. That's exactly what they do, and why so many cops know less about the law than random citizens. A better society would have high standards for the knowledge expected of police officers, including things like requiring 4-year criminal justice or pre-law degree to be eligible to be hired, rather than capping IQ and preferring people who have had prior experience in conducting violent actions.
It is not so lax as that. It's limited to situations where a reasonable person who knows exactly what the law and previous court rulings say might conclude that a certain action is legal. In this case, other Federal Circuit courts have ruled that similar actions are legal.
Still, 25 years for possessing kiddie porn, damn.
There's also an interesting question wrt simulated (drawn, rendered etc) CSAM, especially now that AI image generators can produce it in bulk. There's no individual suffering nor social damage involved in that at any point, yet it's equally illegal in most jurisdictions, and the penalties aren't any lighter. I've yet to see any sensible arguments in favor of this arrangement - it appears to be purely a "crime against nature" kind of moral panic over the extreme ickiness of the act as opposed to any actual harm caused by it.
That's not outside the purview of US law (especially in the world post-reinterpretation of the Commerce Clause), but it is perhaps worth observing how close to the cliff of "For the good of Society, you must behave optimally, Citizen" such reasoning treads.
For example: AI-generated CP (or hand-drawn illustrations) are viscerally repugnant, but does the same "individual suffering and social damage" reasoning apply to making them illegal? The FBI says yes to both in spite of the fact that we can name no human that was harmed or was unable to give consent in their fabrication (handwaving the source material for the AI, which if one chooses not to handwave it: drop that question on the floor and focus on under what reasoning we make hand-illustrated cartoons illegal to possess that couldn't be applied to pornography in general).
That's some convoluted way to say he deserves 25 years because he may (or may not) at some point in his life molest a kid.
Personally I think that the idea of convicting a man for his thoughts is borderline crazy.
Users of child pornography need to be arrested, treated, flagged, and receive psychological follow-up all their lives, but sending them away for 25 years is lazy and dangerous, because when they get out they will be even worse than before and won't have much to lose.
Porn of/between consenting adults is fine. CSAM and sexual abuse of minors is not pornography.
EDIT: I intended to reply to the grandparent comment