Readit News
zitterbewegung · 7 years ago
I wonder if all they did was parallel reconstruction. None of what was in the article seemed doable or feasible. Predicting crime the way they claim is probably just another form of profiling, only automated. Figuring out who the influencers in a social network are is hard, from my experience as an undergraduate researcher; community detection can be done, but predicting people's behavior is quite another thing. I wonder if other Palantir customers will face more fallout if people find out what they are doing?

http://leitang.net/presentation/Community%20Detection%20in%2...

https://www.researchgate.net/publication/260598010_PREDICTIN...
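For readers unfamiliar with the techniques the comment mentions, here is a minimal sketch of the two ideas on a hypothetical toy graph: community detection via naive label propagation, and node degree as a crude "influencer" proxy. The graph data and node names are made up for illustration.

```python
import random
from collections import Counter

# Hypothetical toy graph: two tight clusters joined by one bridge edge.
edges = [("a", "b"), ("a", "c"), ("b", "c"),   # cluster 1
         ("d", "e"), ("d", "f"), ("e", "f"),   # cluster 2
         ("c", "d")]                           # bridge
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def label_propagation(adj, iters=20, seed=1):
    """Naive community detection: each node repeatedly adopts the
    most common label among its neighbors."""
    rng = random.Random(seed)
    labels = {n: n for n in adj}
    nodes = list(adj)
    for _ in range(iters):
        rng.shuffle(nodes)
        for n in nodes:
            counts = Counter(labels[m] for m in adj[n])
            labels[n] = counts.most_common(1)[0][0]
    return labels

communities = label_propagation(adj)

# Crude "influencer" proxy: the highest-degree nodes (here, the two
# bridge endpoints "c" and "d").
influencers = sorted(adj, key=lambda n: len(adj[n]), reverse=True)[:2]
```

Real influence measures (betweenness centrality, PageRank) and modularity-based community detection are far more involved; this only illustrates the flavor of the problem the commenter describes.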

tptacek · 7 years ago
First, it's "parallel construction", not "parallel reconstruction".

Second, that term refers to the use of SIGINT (or, more likely, aggregates) collected by intelligence agencies to inform law enforcement without adding to the evidence available to prosecutors.

Despite the scary name, Palantir is not in fact a signals intelligence agency.

There is a lot to be concerned about with police department gang member databases, but police departments predict crime routinely. It's a core part of what it means to run a large city police department. You don't allocate patrols uniformly across the city; that makes no sense.

bigiain · 7 years ago
Seems to me the major ethical problem with "parallel construction" isn't whether the information used to identify and detain the suspect comes from a signals intelligence agency; it's law enforcement investigators and prosecutors intentionally deceiving the court about how and why they acquired and discovered evidence.

If the NSA illegally (or "legally" as they'd no doubt claim) intercepts private communications, and tells local traffic cops to find a pretext to pull over a particular car and search it - telling the defence and the court that the drugs were discovered in a routine traffic stop is parallel construction.

If the tip off comes from Palantir instead of the NSA, and the investigators and prosecutors deceive the court and defence about that involvement - I'd argue it's still parallel construction.

rhizome · 7 years ago
> Second, that term refers to the use of SIGINT (or, more likely, aggregates) collected by intelligence agencies to inform law enforcement but not add to evidence available to prosecutors

It's not that narrow. Stingrays have been used to create maps of behavior that law enforcement later acts on.

https://www.techdirt.com/articles/20180110/14482038982/repor...

akira2501 · 7 years ago
> You don't allocate patrols uniformly across the city; that makes no sense.

And simply concentrating on areas with "high _reported_ crime" isn't a winning strategy, either.

zitterbewegung · 7 years ago
Ok, the experience I had with Palantir: I saw them at Reflections Projections when they were hiring for a software engineering position. They gave me the impression that they not only performed data collection using an ontology but also performed inference and analysis on that ontology.

When I interacted with their software as a trial user, their marketing material made it seem like they not only stored semantic data but also provided insight into it. My impression was that they didn't just build databases; they also aided in the analysis of the resulting data.

I must have been mistaken in my analysis and in those interactions, and they fleeced me. Thank you for correcting me (and for the typo correction). I appreciate your insight on this matter as well.

forapurpose · 7 years ago
> police departments predict crime routinely. It's a core part of what it means to run a large city police department. You don't allocate patrols uniformly across the city; that makes no sense.

Predicting crime at the geographic level is very different from predicting it at the individual level, in terms of people's freedom and rights. EDIT: And even at the geographic level, it leads to many abuses, including 'stop and frisk'.

evan_ · 7 years ago
maybe "parallel construction" is not the right term for this but I think the suggestion is, the investigators may have found the evidence through an illegal method that would not stand up in court and then made it seem like they arrived at the evidence using Palintir or some other legal method.
ggg9990 · 7 years ago
Just because you weren’t able to identify influencers in social networks as an undergraduate researcher doesn’t mean that Palantir hasn’t figured it out.
jstarfish · 7 years ago
Link analysis of criminal organizations is something literally every detective knows how to do. I don't understand the difficulty here unless they think finding good marketing candidates on Instagram is anything like a criminal investigation.

Cops keep track of your associations (hence why they can no longer force you to show them the contents of your phone; your contact history is juicy intel that used to reveal lots of links!). Modern technology just makes it easier.

Deleted Comment

aresant · 7 years ago
This article suggests that the Palantir software, regardless of method, was in fact instrumental in identifying a gang leader who was subsequently sentenced to 100 years in prison.

And that the program was cancelled largely because of the public's discomfort with it, raised by a previous Verge article that laid out the potential for civil liberties violations and the potential large-scale ineffectiveness of Palantir's identification methodology.

With that in mind, it seems like Palantir's largest risk going forward is running afoul of due process in its criminal-justice / policing divisions, not that municipalities won't fall over themselves to hand it billions in fees in the name of efficiency.

ineedasername · 7 years ago
The article suggests the opposite: the prosecutor contends that Palantir was a non-factor in the conviction, hence its absence from the material disclosed to the defense. Nor did the city confirm or comment on the reason for the contract's cancellation; the article only mentions "some" who were "leery" of its use because it could be used to connect gang members to others. That is very vague wording for any concern about the use of Palantir. The article was very light on content here, using words that indicate trepidation without connecting those loaded terms to any explanation.

From prior stories about Palantir's lack of efficacy outside of well-resourced intelligence and military venues, my guess is that lack of efficacy is what caused the contract to go belly up.

bigiain · 7 years ago
> The prosecutor contends that Palantir was a non-factor in the conviction

Parallel construction as a Service.

I bet with the right co-founder you'd be rolling in investment capital for that...

segmondy · 7 years ago
Rookie cops making $30k a year have also identified such criminals. Being sentenced to 100 years means nothing and is irrelevant.
geofft · 7 years ago
It is relevant because that specific conviction caused the public discomfort. (The defendant is claiming that the group he's affiliated with is not a gang, simply a group of acquaintances, and Palantir's analysis caused police to believe it was a gang.)
everdev · 7 years ago
It's amazing how companies can get so big predicting the future. The process is a glorified dart throw, especially in social systems. The variables are always changing, and the effect of reacting to a prediction (like increasing police presence) changes the expected outcome, so predictions become elusive even if initially correct.

Also, when the variables like homicides are relatively low numbers, there are huge percentage swings that happen naturally.
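To make the low-numbers point concrete, a quick arithmetic sketch with made-up annual homicide counts:

```python
# Hypothetical annual homicide counts for a mid-sized city.
homicides = {2016: 20, 2017: 25, 2018: 18}

years = sorted(homicides)
pct_change = {}
for prev, cur in zip(years, years[1:]):
    # Year-over-year change, as a percentage of the prior year's count.
    pct_change[cur] = round(
        100.0 * (homicides[cur] - homicides[prev]) / homicides[prev], 1)

# A shift of just +5 cases reads as "+25%"; a shift of -7 reads as "-28%",
# even though both are well within ordinary noise at these base rates.
```

With counts this small, headline-sized percentage swings can appear without any real change in the underlying rate.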

Macro indicators like weather and per capita income have long been known to be correlated with crime, but trying to predict and proactively reduce it is much harder.

The promise of AI and predictive analytics is huge, but it misses the mark in non-closed systems.

Analemma_ · 7 years ago
I doubt the actual effectiveness or usefulness of the system mattered much to the police department. Most likely this was used for one of two purposes: a) as mentioned above, parallel construction and b) an excuse to arrest people they wanted to arrest anyway, with the system as cover.
ineedasername · 7 years ago
I don't see the parallel construction here. Parallel construction would have to involve use of material that should require a warrant, but one was never obtained. Just generating valuable leads that result in investigation that leads to a warrant isn't parallel construction.
darawk · 7 years ago
All of those things are true x10 in the stock market, but people still seem to be able to consistently make money. I think you're overstating the difficulty of the problem a bit. There's a lot you can do with simple models to predict crime - they're not perfect, but as long as their users understand that, that's fine.
notahacker · 7 years ago
Also, certain crimes like drug use are common to the point where even models highlighting "links" or "suspects" purely at random might yield increased prosecutions, if law enforcement believed in the model enough to investigate more thoroughly than they would when following a "hunch", carrying out random checks, or conducting a "sweep of the area". And if a model is largely uncorrelated with actual (rather than revealed) propensity to commit crime, and rather more closely linked to demographics, then its effect on whom police prioritise chasing with their resources could be a big problem. Certainly a far more likely problem than a model failing to surface any evidence of any criminal activity.

For other reasons, a defendant citing a false negative from a limited social-network profiling tool (its failure to identify him as a gang member) as part of his defence, as the article suggests one New Orleans defendant hoped to do, would also be problematic.

daxorid · 7 years ago
If I may ask, how old are you?

I am increasingly seeing people whose entire experience of the markets has been after 2009 believe that it's trivial to "consistently make money".

Prior to the central banks of the world eliminating volatility and backstopping all asset classes, this was in fact not the case.

everdev · 7 years ago
Not all stock markets have had the "consistent" growth the US has. I put the word in quotes because that exact notion, that the past predicts the future, is what's inherently wrong with many of these companies.

It's super easy to create a model that beats the market when run against historical data. It's a whole other thing to beat the market in real time.

Evidence of this: it's possible to leverage stock positions by multiple factors, so if anyone could accurately predict the future and profit from it, it would be easy to turn $5k into $5M with enough leverage and compounding profits. The problem is that no one can do it consistently at that level of accuracy.
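The gap between backtests and live performance is easy to demonstrate on purely synthetic coin-flip data: pick the best of many random strategies on "past" data and watch its apparent edge shrink out of sample. All data here is simulated; no real market figures are involved.

```python
import random

random.seed(42)

# Synthetic "market": 100 past days and 100 future days, each up (+1) or down (-1).
past = [random.choice([1, -1]) for _ in range(100)]
future = [random.choice([1, -1]) for _ in range(100)]

# 500 strategies that are literally random up/down guesses for all 200 days.
strategies = [[random.choice([1, -1]) for _ in range(200)] for _ in range(500)]

def hit_rate(strategy, days, offset=0):
    """Fraction of days on which the strategy's guess matched the market."""
    return sum(g == d for g, d in zip(strategy[offset:], days)) / len(days)

# "Backtest": select the strategy with the best historical hit rate.
best = max(strategies, key=lambda s: hit_rate(s, past))

in_sample = hit_rate(best, past)             # inflated by selection bias
out_of_sample = hit_rate(best, future, 100)  # reverts toward coin-flip odds
```

The winner looks skilled in sample purely because it was selected after the fact; on unseen data it has no edge, which is the overfitting trap backtested models fall into.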

smogcutter · 7 years ago
Winning, on average, a few more bets than you lose works in the stock market. In policing it's completely unacceptable (or should be).

Deleted Comment

drb91 · 7 years ago
Nobody is more confident than a hedge fund trying to get your cash. It's still a zero-sum game, and not really an applicable metaphor here.
miked · 7 years ago
> Of particular concern to some was the use of Palantir as a tool that could aid investigators both in connecting suspected gang members to others in the community, and in identifying people deemed at high risk of either committing gun violence or being the victim of it.

??? Isn't that precisely what the tool is meant to do?

Havoc · 7 years ago
I'd imagine they got the data they needed anyway
cityofghosts · 7 years ago
and mr palantir got up from his desk, steaming envelopes, and walked down the street, into the future. "ill be a postman" he thought. "wont have to spy on anybody".
sly010 · 7 years ago
Since we are talking about a technology that basically speculates about people, it's kind of funny that according to the article the "New Orleans Police Department's Director of Analytics" is called Ben Horwitz. Hmm...
alexilliamson · 7 years ago
What are you implying?