elihu · 5 years ago
> “When they legislate these things, they need to be appropriating enough money to make sure they work,” a source said. They estimated fixing the SB1310 bug would take roughly 2,000 additional programming hours.

40 hours a week times 52 weeks is 2,080 hours. Subtract a few weeks for vacations and holidays, and you get a little less than 2,000 hours. So, basically, this is a little more than one programmer-year of effort if the estimate is in the right ballpark.
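A quick back-of-the-envelope check of that figure (a sketch only; the three weeks of time off is an assumption):

```python
# Rough sanity check of the "2,000 hours ~ one programmer-year" claim.
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 52
TIME_OFF_WEEKS = 3  # assumed: vacation plus holidays

full_year = HOURS_PER_WEEK * WEEKS_PER_YEAR                        # 2080
working_year = HOURS_PER_WEEK * (WEEKS_PER_YEAR - TIME_OFF_WEEKS)  # 1960
print(full_year, working_year)  # 2080 1960
```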

It's gross that the decision not to fix this carries an apparent implicit economic calculation that one programmer-year is more valuable than the freedom that is being denied to an unknown number of people whom society deems less important. (Granted the actual situation is more complicated and the state is constrained by their contract with the vendor, which we can reasonably guess is going to charge as much as they can contractually get away with rather than the programmer's actual salary cost.)

At least the Department of Corrections has assigned people to do the calculations manually. That's better, but it sounds like they just don't have enough people on it to keep up.

maybenotafart · 5 years ago
That is horse shit and we all know it. What bug takes 2k hours? That's 250 work days. Jesus Christ, if I took that long to fix a bug, fire me. And yes, I'm also counting the time to test, write/fix unit tests, write/fix integration tests, release into production, and do data conversion.
hn_throwaway_99 · 5 years ago
Look at the description of the issue. It's really less of a bug and more of a feature request, in the sense that the legislature changed the rules for how "earned release credits" could be calculated. All of the details are here: https://corrections.az.gov/sites/default/files/documents/PDF... .

Previously, it seems like there was a single standard, applied universally: 1 day of earned release credit for every 6 days served. The new rules have many more inputs, with lots of caveats: only certain offenses are eligible, the inmate cannot have been convicted of certain other offenses, the inmate must have completed specific courses, and the inmate cannot have previously been convicted of certain felonies.
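A minimal sketch of why the change is a feature and not a one-line fix: the old rule is a single constant, while the new rule is a conjunction of independent conditions. All names and thresholds below are invented for illustration; they are not the actual statute text or the ACIS schema.

```python
# Hypothetical eligibility data; not the real statute or ACIS fields.
ELIGIBLE_OFFENSES = {"drug_possession", "paraphernalia_possession"}
DISQUALIFYING_CONVICTIONS = {"violent_felony", "sex_offense"}
REQUIRED_COURSES = {"substance_abuse_program"}

OLD_CREDIT_RATE = 1 / 6  # old rule: one flat standard for everyone

def eligible_for_new_credits(inmate: dict) -> bool:
    """New-style rule: several independent conditions must all hold."""
    return (
        inmate["offense"] in ELIGIBLE_OFFENSES
        and not (inmate["convictions"] & DISQUALIFYING_CONVICTIONS)
        and REQUIRED_COURSES <= inmate["completed_courses"]
        and not (inmate["prior_felonies"] & DISQUALIFYING_CONVICTIONS)
    )

inmate = {
    "offense": "drug_possession",
    "convictions": set(),
    "completed_courses": {"substance_abuse_program"},
    "prior_felonies": set(),
}
print(eligible_for_new_credits(inmate))  # True
```

Each new input here is a new data source the vendor has to model, migrate, and test, which is part of why "feature request" fits better than "bug."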

The 2k hours may very well be excessive, and frankly I don't care if it takes 20k hours; if that's the case, they should mothball their software and do the work manually. But just calling it a "bug" is misleading IMO.

m463 · 5 years ago
It would sure get fixed if it was releasing prisoners early.

You'd have the private prisons and the prison guards union climbing up everyone's posteriors.

adrianN · 5 years ago
You don't know the codebase. Even people who know a codebase have a hard time giving accurate estimates.
rsj_hn · 5 years ago
> It's gross that the decision not to fix this carries an apparent implicit economic calculation

Spending money will remain economic decision until we can have government agencies fueled by the righteous indignation of their critics rather than having a line item added to their budget. Until you can convert that indignation into legal tender, agencies will remain subject to old fashioned accounting constraints.

craig131 · 5 years ago
The onerous budget item we are talking about here is a feature that multiplies days sentenced by 0.7 if the inmate completes one checkbox item. You know, just to keep things in perspective.
ipsin · 5 years ago
I understand that there's a lot of work that could go into this sort of thing (mocks, accessibility, testing)... but is 2000 hours really a defensible number? It sounds like there's new per-inmate data and calculations for inmate eligibility and sentencing credit. But 2000 hours worth of work? Even sand-bagging it sounds like way too much.
pixl97 · 5 years ago
Of course, you're assuming there's anyone left at the company who understands the software. So many times the original team that wrote it is gone, and there's a plate of spaghetti left for the next group to figure out.
codyb · 5 years ago
I was pretty incredulous until a cousin poster started talking about auditing, testing, red tape, out of date systems, and security.

Seems plausible it could balloon.

maybenotafart · 5 years ago
sounds much more like budget politics against the incompetent
JumpCrisscross · 5 years ago
> an apparent implicit economic calculation that one programmer-year is more valuable than the freedom that is being denied to an unknown number of people whom society deems less important

I’m surprised this doesn’t create a massive liability for the state.

myth2018 · 5 years ago
I've had the opportunity of participating in the design phase of an application to control the distribution of ostomy bags for a network of public hospitals. I was a bit shocked when I learned that the decision-makers were about to cut functionality intended to provide workarounds in case of system failures, so that employees could keep delivering bags upon request.

That would basically result in patients not getting their ostomy bags on time, and I can't even imagine what would follow. What would the reactions of patients and their relatives be, what levels of stress would hospital employees be subjected to, and so on.

I left the company some months after that, and I don't know what the final decision was, but they'd been warned.

Maybe one day some set of ethical standards will be considered non-functional requirements as important as robustness, security and others.

With technicians being responsible for warning their managers, managers being responsible for assessing risks and documenting their decisions, everything being made transparently and everybody being accountable.

elihu · 5 years ago
Yeah, I'd assume this would be resolved real quick if the state (or the contractor responsible for the software) had to pay out, say, $100 per inmate per day that they improperly spent in jail past the end of their sentence.

That this problem is allowed to persist seems like an indication that the people in charge believe that prisoners have a low probability of successfully suing the state for damages.

TexasfoldsEm · 5 years ago
Wow. I know firsthand from family how this can severely damage someone's mental health in ways that may not be obvious. It weighs on a person every moment past the first hour they go beyond their release time, then the first day, followed by all the ways other inmates and guards will take advantage while one's defenses are down. The constant fun poked by jealous inmates and cruel guards also weighs hard on a human being. The Arizona penal system almost always puts you into very nasty and dangerous places of incarceration. frompdx made a statement that truly made my gut feel as if I was at the top of a roller coaster I did not want to get on in the first place.
WalterBright · 5 years ago
If a prisoner knows he's past his release date, can't he contact his lawyer?
omginternets · 5 years ago
I'm worried this question might get written off. I would actually like to know the answer to this as well.

My immediate reaction is that either (1) it is possible, and the story is therefore more nuanced than it might appear at first glance, or (2) it is not possible, and this is an even more egregious problem.

kar5pt · 5 years ago
From reading the article it sounds like laws dictating when a prisoner is released are pretty complicated and many prisoners may not know if they should be released early based on good behavior or changing laws.

That's just my assumption. Remember, prisoners tend to be from less privileged backgrounds, and some may be very ignorant of how the law works or even functionally illiterate. So things that seem "obvious" to educated engineers may not be obvious to them.

mulmen · 5 years ago
Depends on if the computer has taken away phone privileges. I suppose a good lawyer would already know the release date and take action without being contacted? But I have no idea.
alistairSH · 5 years ago
You assume people who are jailed have lawyers on retainer. They largely don't.
oh_sigh · 5 years ago
Sure, that still takes time to get through the system though. Probably months between scheduling and final release.

More commonly though the people wouldn't even know to contact their lawyer, because they are credited for time served pre-conviction.

jdeibele · 5 years ago
When I wanted to have compiled [1] financials, PriceWaterhouseCoopers told me to pick a recognized accounting system, then change the company's business processes to match that. They said absolutely not to go the other way, to try to customize any software to match our business.

I think about that every time I read about another government (or private!) organization that wastes tens or hundreds of millions of dollars (or euros or pounds) on custom software.

It seems like there should be 1, 2, or 3 DMV programs. The same for building codes, tax codes, etc. And prison software. You can be more like Massachusetts or Mississippi or Montana (hypothetical examples) but pick one and harmonize with it.

1: compiled is the lowest of 3 standards that outside accountants can do; "reviewed" is higher and "audited" is the highest. Even at the compiled level they mailed out postcards to a certain number of customers asking if they were customers over the past year and had spent this much money. It was fairly easy for the acquiring company's outside accountants to review PWC's work and bring it up to audited standard.

reillyse · 5 years ago
This appears to me to be a terrible idea. In effect you would have private companies writing the laws of the land. "I'm sorry California you can't change your laws because it doesn't fit into the three options we have available at our preferred software vendor". Seems like the tail wagging the dog.
toomuchtodo · 5 years ago
Login.gov cribbed off of the UK’s digital office, which built a similar system. I believe that’s what OP was alluding to.

How many unemployment systems, prisoner tracking systems, DMV systems do you need? These are common components across governments.

Example: Login.gov now supports local and state government partners. Your constituent IAM needs can now be met by a federal team that is efficient and competent, instead of every city and state reinventing the wheel (poorly and expensively).

jdeibele · 5 years ago
It could go that way. But the idea is that Massachusetts might charge EVs extra license fees because they want to replace the lost gas taxes whereas Montana and Mississippi wouldn't. Massachusetts already has different and higher pollution regulations (typically based on California's).

Other states might want to do the same, although the fees would probably differ. So the idea is that 10 or 15 states cluster around one solution for a department, 20 for another, 10 for a third and the rest go their own way. The states would have a lot of power in being able to replace working solution A with B or C. So there's 3 or 4 DMV vendors, there's 3 or 4 unemployment vendors, some for contact tracing (my state of Oregon still hasn't implemented the Google/Apple tracing), and so on.

The current situation is that you know a potential replacement will be late and over budget, you just don't know exactly how bad it will be. And Accenture and IBM like it that way and are very adept at persuading the decision makers that they're very special snowflakes and can't use an off-the-shelf solution.

boomboomsubban · 5 years ago
You can solve this problem pretty easily by using free software projects.
airstrike · 5 years ago
An alternative is to have the federal government offer the "federal choice" which states and local governments can choose to use instead of rolling out their own.
spaetzleesser · 5 years ago
In states, counties and cities a lot of contracting basically has the purpose of pushing money to well connected people. They don’t want an efficient and cost effective solution.

I know somebody who audits municipalities. We did a graph that showed relations between different players. It’s basically just a big insider club of usually 20-40 people and families that give contracts to each other at the expense of the tax payer.

curryst · 5 years ago
That's hard to do because all of those systems are intertwined. If you use the Montana DMV program, and you want people who get DUIs to have their driver's license suspended, now you have to use the Montana Penal System program. Except Montana's Penal System has a bunch of exceptions written into it for laws that can be either a misdemeanor or a felony, and they don't allow any time off for good behavior. So now are you going to adopt Montana's laws too? There's tax code stuff in there too, so I guess we're lumping in the Montana tax system as well.

I think the problem is that unlike our more notable branches, we don't hire experts in the field. I don't mean they're incompetent at technology, but that a problem like this really exists at the intersection of government and technology. We keep hiring general-purpose contractors to build things like this, and then we're shocked when it falls apart in the environment governments exist in.

We need companies that specialize in this intersection. Companies that can keep public sentiment in mind and build an architecture that's flexible in the places where society is. It's the same way that most of us in general purpose IT try to build systems that can adapt to changes in the IT landscape. Put it in Docker so we can run it on a cloud, on bare metal, on k8s and probably on whatever's next. Governments struggle to pivot like that due to funding (how do you argue for funding for features since you can't earn revenue?), and because a lot of it is legislated out of their control. Learning to read the public sentiment is just like us reading trends in a newsletter.

er4hn · 5 years ago
This advice seems premised on deficiencies in programming, however. Programs run algorithms to process data; when the program or algorithm fails to do so properly, the program is at fault.

In your cases you have items like: accounting, building codes, tax codes, automobile codes, etc.

While it makes sense to try and harmonize with the general policies, every state, every municipality, and every business is going to have special cases. Even software has edge cases for protocol behaviors.

What would be nicer, imho, is if all of these laws were written in domain-specific languages that specify the law, so the software could just pick up the definitions signed into law. Lawyers today feel like legal interpreters combined with red/blue security team members, depending on what they are doing.
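A toy illustration of the "law as a machine-readable definition" idea: each rule is a declarative record that software loads directly, instead of logic a vendor re-implements by hand. Everything here (the rule id's fields, the 1/3 rate, the prerequisite names) is invented for illustration:

```python
# Toy "law as data": credit rules published as declarative records.
RULES = [
    {"id": "SB1310", "effective": "2019-06-01",
     "credit_days_per_served": 1 / 3,
     "requires": ["eligible_offense", "program_completed"]},
]

def applicable_rules(facts: set, rules=RULES) -> list:
    """Return every rule whose prerequisites are all present in `facts`."""
    return [r for r in rules if set(r["requires"]) <= facts]

matched = applicable_rules({"eligible_offense", "program_completed"})
print([r["id"] for r in matched])  # ['SB1310']
```

In a real system the interesting (and hard) part is agreeing on the vocabulary of facts, which is roughly the job lawyers do today.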

sterlind · 5 years ago
My dad actually created a (failed) startup in the early '00s that modeled immigration law in Prolog, enabling the creation of legally accurate forms and resolving complex legal queries. It was a good idea, it just failed due to infighting and mismanagement.

are there popular languages for implementing these types of DSLs?

ohboichamois · 5 years ago
FWIW this is actually (mostly) the case for building codes. The standard in the USA is the International Building Code (and other related code by the International Code Council), which each jurisdiction adopts into law and amends as needed for local conditions or practices. And these codes in turn reference other international standards specific to the knowledge domain.
dragonwriter · 5 years ago
> It seems like there should be 1, 2, or 3 DMV programs. The same for building codes, tax codes, etc. And prison software. You can be more like Massachusetts or Mississippi or Montana (hypothetical examples) but pick one and harmonize with it.

Kind of defeats the entire purpose of having states to start with.

If we want to make the US a centralized, unitary state, let's do that through the elected central government and not through deferral to IT contractors.

notwhereyouare · 5 years ago
That article just kept getting worse and worse. They mention assigning a penalty to the wrong inmate and they couldn't fix it.

All of a sudden that person could no longer make calls for 30 days, and they did nothing wrong to get that.

toomuchtodo · 5 years ago
“Show me the incentives, I’ll show you the outcome.”

If corrections staff were held personally liable for these failures, or the local jurisdiction faced steep financial penalties, it wouldn’t happen. No liability, no responsibility.

m463 · 5 years ago
> "Show me the incentives, I’ll show you the outcome."

That is spot on, and generalizes well.

"iot vendors make post-sales money if they collect data from their device"

"phone vendors make money if they bundle terrible apps with their phone"

"robocallers make lots of money, with historically no fines paid out for violations"

blobbers · 5 years ago
These are prison workers and you're asking them to run a social network (with certain constraints).

They wouldn't even know the first thing about how to hire someone capable of doing this. They'd have to hire a consultant to hire another consultant.

alex_young · 5 years ago
Isn't this clearly defined false imprisonment under Arizona law?

Here's the relevant statute:

13-1303. Unlawful imprisonment; classification; definition

A. A person commits unlawful imprisonment by knowingly restraining another person.

B. In any prosecution for unlawful imprisonment, it is a defense that:

1. The restraint was accomplished by a peace officer or detention officer acting in good faith in the lawful performance of his duty; or

2. The defendant is a relative of the person restrained and the defendant's sole intent is to assume lawful custody of that person and the restraint was accomplished without physical injury.

C. Unlawful imprisonment is a class 6 felony unless the victim is released voluntarily by the defendant without physical injury in a safe place before arrest in which case it is a class 1 misdemeanor.

D. For the purposes of this section, "detention officer" means a person other than an elected official who is employed by a county, city or town and who is responsible for the supervision, protection, care, custody or control of inmates in a county or municipal correctional institution. Detention officer does not include counselors or secretarial, clerical or professionally trained personnel.

https://www.azleg.gov/ars/13/01303.htm

Assumption being that a detention officer is not acting in good faith if they have a list of people who should no longer be detained under state law.

chordalkeyboard · 5 years ago
You’re assuming that agents of the government are expected to follow their own laws. Those laws are for you and me, not our beknighted public servants.
boomboomsubban · 5 years ago
Their list is of people eligible for a program that would give them an early release, so unless the inmate enrolls the prison would be acting in good faith. Almost like the law was intentionally worded to limit their liability.

Deleted Comment

bigwavedave · 5 years ago
> Isn't this clearly defined false imprisonment under Arizona law?

> Assumption being that a detention officer is not acting in good faith if they have a list of people who should no longer be detained under state law.

I agree with your premise and assertion, but I'm not sure that's exactly what's happening here. I'd like to preface this by saying I absolutely believe there need to be ramifications; I'm just not sure this fits "clearly defined false imprisonment." I think a "negligence" category would have to be added to the false imprisonment statute for it to apply, and let me tell you why:

From what I can tell, this article is talking about a couple of massive issues but the wrongful imprisonment bit is about a specific bug (SB1310) in ACIS that can't calculate an updated release date for inmates that complete special programs that award additional release credits as per an amendment signed into law in 2019. Since they can't automatically update a release date for individuals that have completed this program, they keep track of it manually. To me, the article doesn't read like they have a list of people who should be released but aren't being released because the software says so; from my very limited perspective it reads like there are certain programs an inmate can complete to earn extra release credits and since the system can't track these extra credits, the detention officers do it manually. I would imagine their manual process goes something like this:

1) Compile list of inmates that have earned extra release credits through the aforementioned release programming.

2) Select inmate from list, possibly in order of original release date, earliest first.

3) Calculate the amount of release credits they received from completion of the programming.

4) Calculate the total hours those credits equal.

5) Deduct hours from release date.

6) Manually update the release date in ACIS (likely requiring warden and/or judicial approval, but idk).

6a) Since ACIS now has the appropriate release date, the inmate will be processed for release now (if the date has passed) or as they normally would be.

6b) Remove inmate's name from list unless currently enrolled in early release programming, in which case they are moved to the bottom of the queue.

7) Lather, rinse, repeat.
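The manual loop imagined above, as a sketch (the 1-credit-per-3-program-days rate, the field names, and the approval step are all assumptions, not the department's actual procedure):

```python
from datetime import date, timedelta

CREDIT_RATE = (1, 3)  # assumed: 1 credit day earned per 3 program days

def recalculated_release(original_release: date, program_days: int) -> date:
    """Steps 3-5: convert earned credits into days and deduct them."""
    credit_days = program_days * CREDIT_RATE[0] // CREDIT_RATE[1]
    return original_release - timedelta(days=credit_days)

# Steps 2 and 6: work the earliest release dates first; the corrected
# date would then be entered into ACIS by hand after approval.
queue = [
    {"name": "A", "release": date(2021, 6, 1), "program_days": 90},
    {"name": "B", "release": date(2021, 3, 1), "program_days": 30},
]
for inmate in sorted(queue, key=lambda i: i["release"]):
    inmate["release"] = recalculated_release(inmate["release"],
                                             inmate["program_days"])
    print(inmate["name"], inmate["release"])
```

Even this trivial sketch shows where a tired human with a calculator can slip: an off-by-one in the credit conversion moves a real person's release date.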

Being denied release because of a software error would be hellish for both an inmate and their loved ones... But because it doesn't seem like they have an actual list of people that should have already been released but haven't been because the software made a critical oversight, I don't think it fits the legislation as it exists today for false imprisonment. The tool is broken so they've switched to manual calculation until someone more important decides it's worth fixing.

If we add negligence to the false imprisonment statute, I'd agree wholeheartedly! But IA[very_much]NAL, so I'll confess I don't really know anything about anything.

EDIT: formatting

woodruffw · 5 years ago
To color this even further: the hundreds of people who are illegally imprisoned are being held for drug or even just paraphernalia possession. The law that grants them credits explicitly excludes violent felons[1].

[1]: https://corrections.az.gov/sites/default/files/documents/PDF...

edoceo · 5 years ago
It's like government systems don't even have test cases. They should, and they should be public. Why isn't software written for the public open source?

See also: employment security sites, cannabis track and trace, driving license, etc.

Some of these bugs cause direct financial harm to citizens and this one is much worse!

Show me the test cases! Show me the code!!

sodality2 · 5 years ago
If my tax $ goes to it, it should have source available (excepting natsec). it would be nice to get some value out of it. If it's well written, I could learn how a large scale project works. If not, I can have something to petition and voice my concerns about, inform about vulns, etc.
colejohnson66 · 5 years ago
You exempt national security, and suddenly everything is national security. Look at the FISA “courts”.

Not arguing against it. State secrets are needed in some instances. Just pointing out that if you exempt something, there'll be people who'll construe as much as they can under that exemption. Is there any solution to that?

adolph · 5 years ago
Well, I can't show you the test cases and code, but the available requirements are pretty tough to go through:

https://www.azleg.gov/ARStitle/

lbriner · 5 years ago
I think there is still a genuine concern that open-source software allows bad people to find loopholes before the good people do. The last thing you want is someone finding a bug that allows a murderer to get released because the computer said so.

I think it can be managed but it is a genuine concern nonetheless.

nonameiguess · 5 years ago
Restrict access. Why does a prison management system need to be connected to a public network and be accessible to more than 20 or so authorized users? I worked on plenty of government systems using insecure software galore but it didn't really matter because we were air gapped and you needed to get through Fort Knox level physical security to get physical access to a terminal.

Granted, that doesn't make attack impossible, but it does make it very hard, especially when you disable all the USB ports and optical drives and socialize extreme consequences to any employees not following ITSEC rules.

dec0dedab0de · 5 years ago
I would much rather err on the side of releasing someone early, instead of holding people longer.
p_l · 5 years ago
In best cases, the test cases are good and pass... and yet such errors will still abound.

Why? Because the spec the tests were written against didn't include some contingency: for example, software that rigidly requires certain steps to happen and provides no human-controlled override.

Deleted Comment

frompdx · 5 years ago
This is an outrage. It is also a perfect example of how software is used to create increasingly more elaborate and faceless bureaucracies that force individuals to spend more and more time contending with them. Somehow software has become the ultimate vehicle for bureaucratic violence. Software is simultaneously infallible and the perfect scapegoat. The inmate who lost their phone privileges for 30 days is an example. They did nothing wrong but the computer says so and nothing can be done. The computer is right in the sense that its decision cannot be undone, and solely to blame since no human can undo its edict or be held accountable, apparently. It is tragic and absurd.

There was an Ask HN question the other day where the poster asked if the software we are building is making the world a better place. There were hardly any replies at all. Is this because for the most part our efforts in producing software are actually doing the opposite? It certainly seems that way reading articles like this.

whack · 5 years ago
The following is very illuminating:

> Instead of fixing the bug, department sources said employees are attempting to identify qualifying inmates manually... But sources say the department isn’t even scratching the surface of the entire number of eligible inmates. “The only prisoners that are getting into programming are the squeaky wheels,” a source said, “the ones who already know they qualify or people who have family members on the outside advocating for them.”

> In the meantime, Lamoreaux confirmed the “data is being calculated manually and then entered into the system.” Department sources said this means “someone is sitting there crunching numbers with a calculator and interpreting how each of the new laws that have been passed would impact an inmate.” “It makes me sick,” one source said, noting that even the most diligent employees are capable of making math errors that could result in additional months or years in prison for an inmate. “What the hell are we doing here? People’s lives are at stake.”

Comments like yours seem to glorify a pre-software world filled with manual entry. The reality is that manual entry is even more error-prone, bias-prone, with more people falling through the cracks.

If nothing else, software can be uniformly applied at a mass scale, and audited for any and all bugs. And faulty software can be exposed through leaks like the above, to expose and fix systemic problems. Whereas a world of manual entry simply ignores vast numbers of errors and biases which are extremely hard to detect/prove, and even then, can simply be scapegoated with some unlucky individuals, without any effort to fix systemically.

derefr · 5 years ago
The "right" bureaucratic system isn't one with humans doing calculations (which we're bad at); nor is it one where computers on their own make decisions (which they're bad/inflexible at.)

Instead, it's one where computers do calculations but don't make decisions; and then humans look at those calculations and have a final say (and responsibility!) over inputting a decision into the computer in response to the calculations the computer did, plus any other qualitative raw data factors that are human-legible but machine-illegible (e.g. the "special requests" field on your pizza order.)

Governments already know how to design human-computer systems this way; that knowledge is just not evenly distributed. This is, for example, how military drone software works: the robot computes a target lock and says "I can shoot that if you tell me to"; the human operator makes the decision of whether to grant authorization to shoot; the robot, with authorization, then computes when is best to shoot, and shoots at the optimal time (unless authorization is revoked before that happens.) A human operator somewhere nevertheless bears final responsibility for each shot fired. The human is in command of the software, just as they would be in command of a platoon of infantrymen.

You know policy/mechanism separation? For bureaucratic processes, mechanism is generally fine to automate 100%. But, at the point where policy is computed, you can gain a lot by ensuring that the computed policy goes through a final predicate-function workflow-step defined as "show a human my work and my proposed decision, and then return their decision."
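A minimal sketch of that separation: the machine computes and proposes, and a named human decides before anything is recorded. All names here are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    inmate_id: str
    computed_release: str
    rationale: str  # the "show my work" part, kept human-legible

def decide(proposal: Proposal, approver: str, approved: bool) -> dict:
    """The computer never acts on its own calculation; a named human
    accepts or rejects it, and that decision is what gets recorded."""
    return {
        "proposal": proposal,
        "approved": approved,
        "responsible_human": approver,  # accountability lives here
    }

p = Proposal("12345", "2021-05-02", "90 program days -> 30 credit days")
record = decide(p, approver="warden_smith", approved=True)
print(record["responsible_human"])  # warden_smith
```

The point of the design is the audit trail: every release date in the system traces back to a person, not to "the computer said so."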

munk-a · 5 years ago
> Comments like yours seem to glorify a pre-software world filled with manual entry. The reality is that manual entry is even more error-prone, bias-prone, with more people falling through the cracks.

I think that the pre-software world was quite bias-prone and extremely expensive for large processing jobs like this. The question is how this system was allowed to transition from the expensive manually managed system that used to be in place to the automatic software driven system that is replacing it at such a cut-rate that gigantic bugs were allowed to sneak in.

It appears this software is primarily used by the state government, so why was such a poor replacement allowed as a substitute for the working manual process?

Also, the number of bugs this software has accumulated since Nov 2019 (14000) is astounding enough that I assume it's counting incidents - that's a fair way to go since these are folks' lives, but I'd be curious to know just how bug laden this software actually is.

Although there is another factor here - this specific release program was a rather late feature addition that may not have been covered in the original contract with ACIS since the bill was only signed into law two months before the software was rolled out.

caconym_ · 5 years ago
> The reality is that manual entry is even more error-prone, bias-prone, with more people falling through the cracks.

It doesn’t have to be. But when it’s subjected to the same incentives that produced this software and perpetuated its broken state, we should expect the result to be much the same.

When you pull back and try to look at it with fresh eyes, our prison system is abjectly terrifying. It’s designed to funnel wealth to private entities, not to implement justice or rehabilitate criminals or whatever other worthy goal(s) you might imagine for it. This story (as horrifying as it is just by itself) is only one little corner of the monolithic perversity of the system as a whole, and the executive powers involved in steering that system are about as close to evil as you can find in the real world.

The whole thing needs to be torn down and rebuilt. As long as it exists, it puts the lie to our claim of being a society that values freedom and justice.

Circling back, I guess the point is that the ideas about how to do software in your last paragraph have no chance of being implemented in the system as it currently exists. To fix “systemic problems”, we will have to aim a lot higher with a much bigger gun.

brundolf · 5 years ago
The way I see it, one aspect of this is software literacy. The bureaucrats would only be doing the task by hand instead of fixing the bug (or even cobbling together a more basic automation! Excel could probably get them most of the way there) if they are a) unable to do it themselves, and b) can't/don't want to pay an expert to do it.
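For a sense of scale, the core rule described in the reporting (one day of earned release credit per six days served, gated by eligibility conditions) is small enough to script. The field names and conditions below are invented for illustration, not the actual statute's:

```python
from dataclasses import dataclass

# Hypothetical, simplified model of an earned-release-credit rule:
# 1 day of credit per 6 days served, but only for eligible inmates.
# Field names and eligibility conditions are illustrative assumptions.

@dataclass
class Inmate:
    days_served: int
    offense_eligible: bool         # offense type qualifies under the statute
    completed_programming: bool    # required courses finished
    prior_disqualifying_felony: bool

def earned_release_credits(inmate: Inmate) -> int:
    """Return earned release credits in days, or 0 if ineligible."""
    eligible = (
        inmate.offense_eligible
        and inmate.completed_programming
        and not inmate.prior_disqualifying_felony
    )
    if not eligible:
        return 0
    return inmate.days_served // 6

# Example: 600 days served by a fully eligible inmate earns 100 days of credit.
print(earned_release_credits(Inmate(600, True, True, False)))
```

The hard part, of course, is not this arithmetic but encoding the statute's caveats correctly and keeping them current as the law changes.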

We can no longer afford to partition the people who understand/use business logic from the people who turn it into code and maintain that code. Period. It's ridiculous and endemic at this point. This problem permeates virtually every large organization in existence; public or private.

It's partly an issue of education, partly an issue of organizational structuring, and partly an issue of accessibility of technologies. But the sum of these parts has become entirely unacceptable in the year 2021.

UweSchmidt · 5 years ago
One of the issues is that laws are made on paper and then everyone needs to figure out how to map it to software. Instead, laws should be codified in software and legal APIs should be binding. This would do wonders for efficiency, but also force laws to be cleaned up, be consistent, simple and logical.
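As a sketch of what a binding "legal API" might look like, here is a rule expressed as a pure function that returns both a decision and the clause justifying it. The rates and section labels are invented, not taken from any real statute:

```python
from typing import Tuple

# Sketch of the "laws as code" idea: a statutory rule as a pure function
# returning both a decision and the clause that justifies it.
# Offense classes, rates, and section labels are invented for illustration.

def release_credit_rate(offense_class: str, completed_program: bool) -> Tuple[float, str]:
    """Return (credit days earned per day served, citing clause)."""
    if offense_class == "violent":
        return (0.0, "SS-A: violent offenses earn no credit")
    if completed_program:
        return (0.5, "SS-B: eligible offenses with completed programming earn 1 day per 2 served")
    return (1 / 6, "SS-C: default rate of 1 day per 6 served")

rate, clause = release_credit_rate("drug_possession", True)
print(rate, clause)
```

Forcing a statute through a type checker like this would surface ambiguities (undefined offense classes, overlapping clauses) before the law takes effect, rather than years into an implementation contract.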
Clubber · 5 years ago
I don't think it's so much that software is better or worse than manual entry. First, it's the attitude / rules that assume what's in the system is right. Second, there's no real procedure to audit or check the accuracy of the data.

From a professional who works with data systems: you're more likely to have a database with bad data in it than not.

EsotericAlgo · 5 years ago
While not universally true, a manual process does typically require a process to fix mistakes. This is true of software as well, but the perceived lack of errors in software processes often leads to this being ignored, resulting in the aforementioned “bureaucratic violence”. I do think automated solutions are inherently better because of the bias reasons you call out, but it cuts both ways and removes interpretation from processes that may not respect nuance.
dimitrios1 · 5 years ago
We know exactly how to fix it. Our cowardly politicians and toothless regulatory agencies are not up for the challenge.

For every piece of software that can directly and materially harm someone's life like this, there should be a chain of responsibility. And within that chain, there should be legal recourse and, in most cases, penal consequences, especially in the case of inadequate software quality/testing/validation, should the software fail to perform its task correctly. Bonus side effect, software quality will go up across the board in the industry.

RHSeeger · 5 years ago
It will never be the case that software will be perfect. We can get closer and closer, but the closer we are the more expensive the next step in closing the gap is.

While I do agree that making software better/more reliable is a good goal, I believe we would be better off making the system as a whole more robust; the system that includes humans. For every situation where a piece of software has control of something that affects society (individual, group, etc), there should always be a clear and direct means of appealing / pushing back on the decision that was made. Those means should involve a human reviewing the information and making a decision based on that information, not on what the computer said. There's thread after thread of us saying the exact same thing about companies like Google and Facebook; it should apply as a general rule.

throwaway8581 · 5 years ago
It’s not the software makers who are committing the crimes. It’s the people abdicating responsibility to software. You can’t wipe your hands of releasing a prisoner on schedule by delegating that to software. The software can help you with your task, but if it’s brought to your attention that there’s a mistake, your failure to promptly fix it is on you.
daanlo · 5 years ago
Imho that wouldn’t cause quality to go up. It would just make it more expensive to develop and to fix bugs. Even more cover-your-ass would go on.

Or at least a huge share of that burden needs to be on the client, so that they properly define, test, and control the software they receive.

The problems with the software sound like typical big software project problems. Trying to cover a huge breadth of use cases with lots of very important tiny details and released in a big bang (one migration). It sounds like more of a project mgmt problem than a software problem to me.

But maybe I am just a hammer and see nails everywhere.

frompdx · 5 years ago
My cynical take is that the lack of accountability is exactly what makes software enabled bureaucracy so appealing. If this true, there is no incentive to change.
P_I_Staker · 5 years ago
Or things don't get much better, and management finds a way to make it someone else's problem. Do you think it's the CEO or the people making these decisions that will be locked up? We'll just be arresting whatever yes men show up to be the pawn in the stupid game someone else architected.

More people working with a gun to their head. I'd rather the gun be pointed at the person who already has a gun pointed at me, instead of both barrels facing in my direction.

kingaillas · 5 years ago
Yeah but the cost of that chain will also rise.

If I'm (or my company is) personally on the hook for bugs, then I'm going to adopt a NASA-like software quality regimen, pushing up the cost of the product.

Every single part of the software stack below me, from hardware, OS, compiler toolchain, disavows responsibility so if I have to absorb all the risk, the product is going to be mind bogglingly expensive.

varenc · 5 years ago
If there’s penal consequences for bad software, you can bet that the development cost will easily 10x overnight.
JustSomeNobody · 5 years ago
> Our cowardly politicians and toothless regulatory agencies are not up for the challenge.

Because their constituents want people to be punished and if the inmates have to suffer a little extra so be it, "they shouldn't have committed a crime."

Our society is severely lacking in empathy.

rsj_hn · 5 years ago
> We know exactly how to fix it. Our cowardly politicians and toothless regulatory agencies are not up for the challenge. For every piece of software that can directly and materially harm someone's life like this, there should be a chain of responsibility.

No, you know how to blame people and punish people, but that doesn't mean you know how to deliver custom bespoke software, for a price that the various government agencies can afford, which doesn't have bugs that severely hurt people's lives.

In fact, punishing people is not going to accomplish that.

That's the problem with a legislature that thinks it can pass any law it wants - "let's take into account this new variable X that our software has no way of collecting or measuring" - without looking at the feasibility of actually implementing the law given the infrastructure available, without approving a corresponding budget for software upgrades to actually enact it, and without taking into account how much time it would take to write, test, deploy, and then train people to use the new software. Instead they issue streams of mandates like Emperor Norton and expect the mandates to materialize into existence like the morning dew. And if said morning dew does not appear, they punish and sue the people in charge when those people say there is no way they can do what is being asked.

Of course there is blame on the prison leadership for covering things up and that leadership should be fired, but you can punish and sue people all day long and it's not going to result in any good code being written. Punish enough people, and it will just result in the Law being repealed.

The problem with this type of bespoke code is that it has exactly 1 customer, so it's going to be horrendously expensive while also being buggy and quickly thrown together compared to software whose development costs are leveraged over millions of customers. And then what happens next year when some crusader decides that they need to take some other new variable into account? Constantly changing requirements, underspecified projects, one-off projects whose schedules are impossible to estimate, and cash strapped local governments. Yeah, that's a recipe for success.

This is why everyone hates enterprise software, but even enterprise software has tens of thousands of customers. Bespoke software for the Arizona prison system -- forget it.

raymondh · 5 years ago
> And within that chain, there should be legal recourse and, in most cases, penal consequences,

Wikipedia says, "Under common law, false imprisonment is both a crime and a tort".

rjurney · 5 years ago
There is a chain. There is legal recourse. And there are considerations in government IT that you would not believe, and they are incredibly difficult to deal with on minimal resources. It has to be harder than the private sector, and this application isn't any different from buggy mainframe software run by major banks. It sits and gets crufty.
carlmr · 5 years ago
Maybe every contract like this should be programmed to the same API twice. Then you could at least compare whether the two pieces of software agree. You check the disagreements and get the companies to fix them (this should be part of the contract).

And don't tell me you can't buy two CRUD applications for 24 million dollars. It's a silly amount of money for such a buggy application.
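For what it's worth, this "build it twice and diff" idea is essentially differential testing. A toy harness, with both implementations stubbed out here (the second given a deliberate off-by-one bug), might look like:

```python
import random

# Differential-testing sketch: two independent implementations of the
# same credit calculation are run over random inputs, and any
# disagreement is flagged for human review. Both implementations below
# are invented stubs, not real vendor code.

def impl_vendor_a(days_served: int) -> int:
    return days_served // 6

def impl_vendor_b(days_served: int) -> int:
    # Deliberately buggy second implementation: off-by-one on positive
    # multiples of 6, so the harness has something to catch.
    credit = days_served // 6
    return credit - 1 if days_served % 6 == 0 and days_served > 0 else credit

def find_disagreements(trials: int = 1000, seed: int = 0) -> list:
    rng = random.Random(seed)
    cases = [rng.randrange(0, 3650) for _ in range(trials)]
    return [d for d in cases if impl_vendor_a(d) != impl_vendor_b(d)]

disagreements = find_disagreements()
print(f"{len(disagreements)} disagreements found")
```

In a real contract, the two vendors' systems would sit behind the same API and the diff would run over actual case data, with every disagreement routed to a human before anyone's release date changed.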

rectang · 5 years ago
> cowardly politicians

They aren't cowardly; they are responding rationally to a constituency that hates "criminals". Prioritizing fixing discriminatory systems (such as this software, or "stop and frisk", or the death penalty) is bad electoral politics for "tough on crime" politicians.

x86_64Ubuntu · 5 years ago
The politicians and toothlessness of the regulatory agencies is a direct result of the electorate. The electorate likes these sorts of outcomes, and any politician that goes against them will find themselves either primaried or drummed out of office.
bendbro · 5 years ago
Nobody will take that contract. Any mistake in the government specifying what they want you to build can be politically manipulated into a legal matter with some probability you'll end up in prison.
pc86 · 5 years ago
At what point do you feel the developers - the ones who actually wrote the code - should be held legally responsible for that code's execution?
mattmcknight · 5 years ago
I dropped this in a comment elsewhere in this discussion, but also makes sense here...

I find the government "requirements" process tends to create situations like this. Rather than build flexible software that puts some degree of trust in the person using it, they tend to overspecify the current bureaucratic process. In many cases, the person pushing for the software is looking to use it to enforce bureaucratic control that they have been unable to otherwise exercise, with the effect that the people the project initiator wants to use the software simply work around it. They then institute all sorts of punishments and controls to ensure it must be used. This then results in the kind of insane situation we have here, where you can't do something perfectly legal because "computer says no".

frompdx · 5 years ago
> the person pushing for the software is looking to use software to enforce bureaucratic control that they have been unable to otherwise exercise

This is frequently my observation as well. In the process of creating stricter control, the bureaucrat increases the power of their bureaucracy while shifting the blame for any problems to a faceless entity.

> They then institute all sorts of punishments and controls to ensure it must be used.

This leads me to one of my primary frustrations with the bureaucratization of our lives. Severe consequences are attached to low stakes situations and rational individuals who see the harm caused by the situation are rendered powerless to make changes.

indymike · 5 years ago
In this case, the requirements should be as simple as implement the law.
hodgesrm · 5 years ago
> It is also a perfect example of how software is used to create increasingly more elaborate and faceless bureaucracies that force individuals to spend more and more time contending with them.

You are attacking the wrong target. It's the government that's broken. This kind of outrage can happen just as easily with pencil and paper. The root cause is the lack of accountability and desire to make the government function better.

1-more · 5 years ago
But they can be undone with pencil and paper too. The footgun of automation here can only be undone with either a really good patch (git commit -m 'finally finally works for real this time') or a lot of pencil and paper work that's slower than the processes that caused the problem.

I'll note that this isn't the first time that people have said "well its the algorithm" when they were responsible. The example that springs to mind is bail risk assessments. You're very correct in that there are people making real decisions that are very cruel here. The machines give them something to hide behind.

frompdx · 5 years ago
Except software allows a scale and efficiency that is impossible with pencil and paper while also creating an ideal scapegoat. Software is being used to avoid accountability at a scale much greater than what was possible with manual process.
underwater · 5 years ago
How is this a government problem? People frequently lose access to social network accounts and email because of broken algorithms. Google can blacklist a business and send it broke. Insurance companies, credit bureaus and banks can make a wrong decision and deny credit.
Tepix · 5 years ago
Corporations have similar issues. Just look at the biased image recognition technology that FAANG release
lbriner · 5 years ago
Software that infringes on the public (even if they are criminals), as opposed to software that people can opt to use or not, needs to have a very serious question asked at design time: if the software produces an incorrect result, what mechanism exists to override it, audit it, or provide damages?

The fact people are not asking that is worrying. I understand why the system was not designed to do something that happened later (even if it could have been reasonably foreseen) but the fact that it was implemented with no override is really the scandal.
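That missing-override design can be made concrete. A minimal sketch (the field names are my assumptions, not the real system's schema) in which a recorded human decision always trumps the computed one:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Sketch of "design for override": every automated decision is recorded
# alongside an optional human override, and the override always wins.
# All names here are illustrative assumptions, not the ACIS data model.

@dataclass
class ReleaseDecision:
    inmate_id: str
    computed_release: date                 # what the software calculated
    override_release: Optional[date] = None
    override_reason: str = ""

    @property
    def effective_release(self) -> date:
        return self.override_release or self.computed_release

d = ReleaseDecision("AZ-1234", date(2025, 6, 1))
d.override_release = date(2025, 3, 1)
d.override_reason = "earned release credits applied manually pending software fix"
print(d.effective_release)  # prints 2025-03-01
```

The point is not the ten lines of code but the audit trail: the computed value, the override, and the reason all survive, so a later review can tell who decided what.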

I don't know whether this comes down to an amount of power vested in a Governor that means the rest of the organisation can't say, "sorry Guv, but we can't do this because the software wasn't written to do it". If TV is to be believed, Governors want things done yesterday and leave you to worry about the problems.

Enginerrrd · 5 years ago
As someone with a civil engineering background:

This right here is the difference between conventional engineering disciplines, where designs require a Stamp from an Engineer of Record who takes on personal responsibility in the event of design failures, and the current discipline of software engineering.

There's a big difference between a software developer and a software engineer, and I think that difference should be codified with a licensure and a stamp like it is in every other engineering field in the states.

Software like this ought to require a stamp.

A decent analogy is the environmental work I've done. When we come up with solutions and mitigations to environmental problems, like software, we can't always predict the result because of the complexities involved. So we stamp a design, but we, or the agencies responsible for allowing the project often specify additional monitoring or other stipulations with very specific performance guidelines. It's a flexible system and possible to adapt to, but there are real consequences and fines when targets aren't met. When bad things happen, the specifics of what went wrong and why are very relevant and the engineer may be to blame, or the owner/site manager, or the contractor who did the work, or sometimes no one is to be blamed but the agencies are able to say: "Hey this isn't working and needs to be addressed, do it by this date or else."

In engineering, there's an enormous amount of public trust given to engineered designs. The engineer takes personal responsibility for that public trust that a building or bridge isn't going to fall down. And if you're negligent, it's a BFD.

Given the current level of public trust that we are putting into software systems, it's crazy to me that we haven't adopted a similar system.

frompdx · 5 years ago
I have a very cynical take. Probably too cynical. The ability to shift blame to software as opposed to the humans responsible for administering a bureaucracy is exactly what makes it so appealing. The question is ignored intentionally.
PragmaticPulp · 5 years ago
> The computer is right in the sense that its decision cannot be undone, and solely to blame since no human can undo its edict or be held accountable, apparently.

This is why penalties are such an important part of the feedback loop. Obviously we can't go back in time and restore someone's phone privileges, but we can award monetary damages for the mistake.

Monetary damages alone won't discourage this behavior, though, as ultimately taxpayers foot the bill. There also must be some degree of accountability for those in charge of the system. Software can't become a tool for dodging accountability. Those in charge of implementing the software, providing the inputs, and managing the outputs must be held accountable for related mistakes.

> There was an Ask HN question the other day where the poster asked if the software we are building is making the world a better place. There were hardly any replies at all.

Few Ask HN questions get many responses. This is also a loaded question, as HN is notorious for nit-picking every response and putting too much emphasis on the downsides. For example, I know farmers who have increased their farm productivity massively using modern hardware and software. However, if I posted that it would inevitably draw concerns about replacing human jobs, right-to-repair issues, and other issues surrounding the space. The world is definitely better off for having more efficient and productive farming techniques, freeing most of us up to do things other than farm.

However, all new advances bring a different set of problems. Instead of trying to force everything into broad categories of better or worse I think it's important to acknowledge that technology makes the world different. Different is a combination of better and worse. The modern world has different problems than we did 100 years ago, but given the choice I wouldn't choose to roll back to the pre-computer era.

> It certainly seems that way reading articles like this.

Both news and social media have a strong bias toward articles that spark anger or outrage. For me, the whole world stops feeling like a dumpster fire when I disconnect from news and social media for a while. I'm looking forward to the post-COVID era where we can get back to interacting with each other in person rather than gathering around a constant stream of negative stories on social media.

frompdx · 5 years ago
Both news and social media have a strong bias toward articles that spark anger or outrage.

Absolutely, and I agree that disconnecting can have positive benefits. On the other hand, at least for me personally, covid has disrupted the mechanisms that normally prevent in-depth observation. It has given me time to read books I normally would not have read, because that time went to things like waiting for my car to warm up so I can get to work on time, commuting, going out to lunch with co-workers, and going out for drinks with co-workers, friends, and family.

What is described in the article is outrageous. My concerns about bureaucracy and software's role in enabling it, on the other hand, have developed separately because I have the time to consider it.

ehnto · 5 years ago
It's such a hard question to answer, because software doesn't exist in a vacuum. Hopefully this example is relevant:

You're a software developer maintaining an eCommerce platform, on the one hand your platform helps perpetuate low margin and wasteful consumerism, on the other hand your software enables small businesses to compete in the new online world.

Consumerism is bad, but commerce is as old as civilization and supports all of our lifestyles, so on a macro level you're in a tough spot. You're a talented developer putting their skills to work building something the community needs, I personally think that means you're doing good work in the context of your society, but it is difficult to say if it's making the world a better place.

Social media is the same. On the one hand, it connects family and friends; on the other, it drives narcissism, consumerism and misinformation.

You almost have to try and calculate the "Net Good" or "Net Bad" of a type of software and see how the cards fall. For social media I would suggest that it's currently in a "Net Bad" situation, causing more harm than good for example.

malandrew · 5 years ago
> The inmate who lost their phone privileges for 30 days is an example. They did nothing wrong but the computer says so and nothing can be done. The computer is right in the sense that its decision cannot be undone, and solely to blame since no human can undo its edict or be held accountable, apparently. It is tragic and absurd.

All government software should be open source and anyone should be able to investigate the code and submit bug reports, including inmates. If they know there is something wrong, they have a lot of time on their hands to learn a useful skill to fix these issues.

The government should then not be allowed to close a bug as wontfix or invalid without approval from other citizen watchdogs verifying if a bug report is legitimate.

frompdx · 5 years ago
I agree with what you are proposing in principle. However, the notion that it is up to each individual to combat the system when it has wronged them while they languish in some kind of bureaucratic limbo is one of the core sicknesses of our system. Apart from having direct access to the source code and the ability to make pull requests, that is exactly what is happening here. The bureaucrats involved know there is a problem but are leaving it up to individual inmates and their advocates, both inside and outside the system, to sort it out.
mewpmewp2 · 5 years ago
I'd argue that software like this has saved people from having to do millions of years' worth of mundane work. This news is essentially like a traffic accident: it doesn't mean vehicles in general haven't benefitted the human experience. The fact that it is newsworthy is evidence that it doesn't happen too often.
rectang · 5 years ago
> making the world a better place

For a large segment of the US electorate, anything that inflicts pain on "bad people" is "making the world a better place".

If the software was causing prisoners to be released early, most US voters would be up in arms. But if they're being held too long, the calculus is different. In software terms, for many Americans, a "tough on crime" outcome is a "feature not a bug".

bondolo · 5 years ago
Little Britain's recurring bit Computer says "No", has always been a great illustration of this point. https://www.youtube.com/watch?v=0n_Ty_72Qds
6gvONxR4sf7o · 5 years ago
A big problem is that while we might improve the typical case with software, the failure mode is generally ignored and swept under the rug. See Google's customer service. You can speed up and improve the average case a thousand times over, driving costs down by maybe a thousand times, or you can bring costs down like 2x but keep the benefits of manual, person-centric failure recovery. Even then, non-automation doesn't make it "human" in the sense we all want. A rep in a call center who is only allowed to follow the playbook might almost as well be an automaton for all the freedom they have. Are faceless economies of scale and the bureaucracies they bring the root issue?
davidkhess · 5 years ago
What if the root cause is the ever increasing complexity that software is trying to manage? At all levels (legislature, management, bureaucrats, programming languages, developers, testers, users, subjects) we are creating more and more complex situations we ask software and the institutions that produce it to manage for us.

But as the complexity goes up and the number of these complex situations increases, are we reaching a point where the money, talent and experience needed to deliver working solutions outstrips what our institutions have?

With our resources and intelligence as a species being capped, it seems at some point this is inevitable.

alephnil · 5 years ago
Yes, software can be used as a cover for abuse like in this case, but that happens because people in power let it happen. For other pieces of software that have consequences for people with more power than prisoners, society will not tolerate failures. I only need to mention the MCAS software of the Boeing 737 Max for a counterexample.

Software does not have its own will. Software is only allowed to make decisions on our behalf because we let it do so.

frompdx · 5 years ago
It is interesting that you bring up the 737 Max. I was actually talking about the 737 Max with someone last week, in the context of software being both infallible and the perfect scapegoat. The 737 Max is an example of how software was believed to be infallible (it wasn't) and was ultimately the scapegoat for a design flaw. That doesn't mean what happened with the 737 Max was intentional. However, when the time came to assign blame, fingers began pointing at the software.

I do agree that software has no will. It is a tool for facilitating our will for better or worse.

majormajor · 5 years ago
The lack of desire to get things right throughout the bureaucracy is the problem. The software is just a mechanism. Other organizations that actually care figure out ways to get things right even when the software has issues.

You can see in the film Brazil, from 35 years ago, that this was already a problem and concern even without modern software.

dTal · 5 years ago
Much older than that. The blackly humorous 1965 short story "Computers Don't Argue" by Gordon R. Dickson is pretty much the definitive "software as a bureaucracy" story. No spoilers - it's short and well worth it:

https://www.atariarchives.org/bcc2/showpage.php?page=133

asdfasgasdgasdg · 5 years ago
> Is this because for the most part our efforts in producing software are actually doing the opposite? It certainly seems that way reading articles like this.

I think the most likely explanation is just that people didn't see the question or weren't interested in having the discussion. Most people believe the work they're doing is at worst neutral. A less likely candidate for the reason (but still more likely than your guess) is that people didn't want to be subjected to unfounded criticism of their work from people who don't know anything about it.

austincheney · 5 years ago
The best solution to the problem is to hold developers personally liable for the software they write, as well as the owners. That could mean criminal penalties for negligent violations of industry standards and processes but will mostly result in civil penalties.

The second- and third-order consequences are that developers will insulate themselves behind licensing and proofs of practice, like every other industry.

Until people actually advocate for real penalties for such harmful violations they don’t care. All their temporary whining and crying is just blowing smoke up our asses.

eunos · 5 years ago
I am not sure how a software bug is the exclusive enabler, since it is just as plausible for an administrative bug to occur with pen and paper and a compliant warden.
frompdx · 5 years ago
It's not that software is the exclusive enabler. It is that software is the ideal enabler because of its ability to create a truly faceless entity that seems to exist outside the power of even those who administer it. Of course these issues were always possible without software. Software is just so much more efficient and useful for creating these kinds of issues because it can scale and because it can be the scapegoat.
dmead · 5 years ago
most of us have jobs that service the revenue streams of rich owners. what did you think was going to happen?
bitL · 5 years ago
> if the software we are building is making the world a better place

No, it's now all about "extracting value", "rent seeking", "subscriptions", "censorship", "monopoly" and "control". We got bribed by FAANG and this is the consequence.

Bakary · 5 years ago
Bribed implies the natural course of the tech worker would be altruistic, which is quite the assumption
amelius · 5 years ago
There is plenty of useful software. For example: scientific software.
frompdx · 5 years ago
That's true, and I am not arguing that useful software does not exist. Rather, a lot of the energy that goes into producing software yields things that are not useful, or useful only in perverse ways.
dustingetz · 5 years ago
Software is a tool used by people. You are measuring America.
pm90 · 5 years ago
Software is just a tool, it can be used to build good or bad things.

It would be hard to see this in e.g. Scandinavian countries, where incarceration is seen as rehabilitative rather than punitive.

In the US, racial discrimination, free market extremism along with "tough on crime" laws have created unimaginably cruel systems; together with private prisons, the goal has been on cutting costs rather than rehabilitating prisoners. Software is just a tool to further that goal.

frompdx · 5 years ago
Indeed, software is an enabling technology and is morally ambivalent just like any other tool. A machine tool can make a medical device to save lives or a military device to destroy lives. At the heart of the issue is the intricate web of institutional mouse traps designed to convert low stakes issues into serious offenses. For example, a person who does not respond to a citation for expired vehicle tags is now ensnared in a mechanism that turns a minor issue related to tax collection into a real criminal offense. It is up to humans to make these things so.

I brought up the Ask HN question mostly because I felt the lack of replies was a silent acknowledgement of the realities of most software endeavors: that they are not making the world a better place. Most aren't going out of their way to make it worse. Probably, it isn't even a consideration.

jerry1979 · 5 years ago
I don't think tools cancel themselves out, and I suspect that nothing "is just" anything.

Even if ideas like "the medium is the message" are partially true and then just partially applicable, that should give us pause when we try to cross out tools in our morality equations.

- https://en.wikipedia.org/wiki/The_medium_is_the_message

P_I_Staker · 5 years ago
Is rehabilitation not the main goal in the USA? We call the places correctional facilities, and do other things that are ostensibly there to correct behavior and prevent recidivism. In fact, it seems like most of the things that prevent people from being successful are unintended side effects.

Here's a thought: Why do we permit private companies to not hire ex-cons? Why do you just get to decide that you don't want to hold up your civic responsibilities like that? Who wants to work with someone who used to be a violent maniac, sleazy thief, or worse?

I agree about cost cutting measures and the criminal justice industrial complex. Still we have bigger issues around crime and reconciliation that prevent us from making progress. To be honest, I have trouble understanding how we're going to change, unless the average person can live with someone ruining their life, then spending "only" a year or so in prison and moving on to be successful in a decent paying job.

We still find that outrageous in the US, and it's going to be very tough to make progress that way. It's not about making something "a goal", especially in a country like the US, it's about convincing the wealthy and powerful class to do anything at all about it and stop making it worse.