> “When they legislate these things, they need to be appropriating enough money to make sure they work,” a source said. They estimated fixing the SB1310 bug would take roughly 2,000 additional programming hours.
40 hours a week times 52 weeks is 2080 hours. Subtract a few weeks for vacations and holidays, and you get a little less than 2000 hours. So, basically, this is a little more than one programmer-year of effort if the estimate is in the right ballpark.
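Spelling out that arithmetic as a quick sanity check (the three weeks of vacation/holiday is an assumed figure):

```python
# Rough check of the "one programmer-year" claim against the 2,000-hour estimate.
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 52
PTO_WEEKS = 3  # assumed vacation + holiday allowance

gross_hours = HOURS_PER_WEEK * WEEKS_PER_YEAR                  # 2080
working_hours = HOURS_PER_WEEK * (WEEKS_PER_YEAR - PTO_WEEKS)  # 1960

print(gross_hours, working_hours)
```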
It's gross that the decision not to fix this carries an apparent implicit economic calculation that one programmer-year is more valuable than the freedom that is being denied to an unknown number of people whom society deems less important. (Granted the actual situation is more complicated and the state is constrained by their contract with the vendor, which we can reasonably guess is going to charge as much as they can contractually get away with rather than the programmer's actual salary cost.)
At least the Department of Corrections has assigned people to do the calculations manually. That's better, but it sounds like they just don't have enough people on it to keep up.
That is horse shit and we all know it. What bug takes 2k hours? That's 250 work days. Jesus Christ, if I took that long to fix a bug, fire me. And yes, I'm also counting time to test, write/fix unit tests, write/fix integration tests, release to production, and convert data.
Look at the description of the issue. It's really less of a bug and more of a feature request, in the sense that the legislature changed the rules for how "earned release credits" could be calculated. All of the details are here: https://corrections.az.gov/sites/default/files/documents/PDF... .
Previously, it seems like there was a single standard, applied universally: 1 day of earned release credit for every 6 days served. The new rules have many more inputs, with lots of caveats: only certain offenses are eligible, the inmate cannot have been convicted of some other types of offenses, the inmate must have completed specific courses, and the inmate can't have previously been convicted of certain felonies.
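To see why this is a feature change and not a one-line fix, here's a sketch of the before/after shape of the calculation. The function names, the 1-in-3 ratio, and the specific conditions are illustrative assumptions, not the actual statute:

```python
def old_credit_days(days_served: int) -> int:
    # Old rule: one universal standard, 1 day of credit per 6 days served.
    return days_served // 6

def new_credit_days(days_served: int,
                    offense_eligible: bool,
                    disqualifying_priors: bool,
                    completed_programming: bool) -> int:
    # New rule (sketched): a better ratio, but gated on several per-inmate
    # inputs. Ratio and conditions here are invented for illustration.
    if offense_eligible and not disqualifying_priors and completed_programming:
        return days_served // 3
    return days_served // 6
```

The old rule needs one input; the new one needs offense history, prior convictions, and course completion records per inmate, which is where the data-model and conversion work comes from.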
The 2k hours may very well be excessive, and frankly I don't care if it takes 20k hours; if that's the case they should mothball their software and do it manually. But just calling it a "bug" is misleading IMO.
> It's gross that the decision not to fix this carries an apparent implicit economic calculation
Spending money will remain an economic decision until we can have government agencies fueled by the righteous indignation of their critics rather than by a line item added to their budget. Until you can convert that indignation into legal tender, agencies will remain subject to old-fashioned accounting constraints.
The onerous budget item we are talking about here is a feature that multiplies days sentenced by 0.7 if the inmate completes one checkbox item. You know, just to keep things in perspective.
I understand that there's a lot of work that could go into this sort of thing (mocks, accessibility, testing)... but is 2000 hours really a defensible number? It sounds like there's new per-inmate data and calculations for inmate eligibility and sentencing credit. But 2000 hours worth of work? Even sand-bagging it sounds like way too much.
Of course, you're assuming anyone is left at the company who understands the software. So often the original team that wrote it is gone, and there's a plate of spaghetti left for the next group to figure out.
> an apparent implicit economic calculation that one programmer-year is more valuable than the freedom that is being denied to an unknown number of people whom society deems less important
I’m surprised this doesn’t create a massive liability for the state.
I had the opportunity to participate in the design phase of an application to manage the distribution of ostomy bags for a network of public hospitals. I was a bit shocked to learn that the decision-makers were about to cut functionality intended to provide workarounds in case of system failures, so that employees could keep delivering bags on request.
That would basically result in patients not getting their ostomy bags on time, and I can't even imagine what would follow. What would the reactions of patients and their relatives be, what levels of stress would hospital employees be subjected to, and so on.
I left the company some months after that, and I don't know what the final decision was, but they'd been warned.
Maybe one day some set of ethical standards will be considered non-functional requirements as important as robustness, security and others.
With technicians being responsible for warning their managers, managers being responsible for assessing risks and documenting their decisions, everything being made transparently and everybody being accountable.
Yeah, I'd assume this would be resolved real quick if the state (or the contractor responsible for the software) had to pay out, say, $100 per inmate per day that they improperly spent in jail past the end of their sentence.
That this problem is allowed to persist seems like an indication that the people in charge believe that prisoners have a low probability of successfully suing the state for damages.
Wow. I know firsthand from family how severely this can destroy someone's mental health, in ways that may not be obvious. It weighs heavily on someone every moment past the first hour beyond their release time, then the first day, followed by a variety of things that other inmates and guards will take advantage of while one's defenses are down. The constant fun poked by jealous inmates and cruel guards also weighs hard on a human being. The Arizona penal system almost always puts you into very nasty and dangerous places of incarceration. frompdx made a statement that truly made my gut feel as if I was at the top of a roller coaster I did not want to get on in the first place.
When I wanted to have compiled [1] financials, PriceWaterhouseCoopers told me to pick a recognized accounting system, then change the company's business processes to match that. They said absolutely not to go the other way, to try to customize any software to match our business.
I think about that every time I read about another government (or private!) organization that wastes tens or hundreds of millions of dollars (or euros or pounds) on custom software.
It seems like there should be 1, 2, or 3 DMV programs. The same for building codes, tax codes, etc. And prison software. You can be more like Massachusetts or Mississippi or Montana (hypothetical examples) but pick one and harmonize with it.
1: compiled is the lowest of 3 standards that outside accountants can do; "reviewed" is higher and "audited" is the highest. Even at the compiled level they mailed out postcards to a certain number of customers asking if they were customers over the past year and had spent this much money. It was fairly easy for the acquiring company's outside accountants to review PWC's work and bring it up to audited standard.
This appears to me to be a terrible idea. In effect you would have private companies writing the laws of the land. "I'm sorry California you can't change your laws because it doesn't fit into the three options we have available at our preferred software vendor". Seems like the tail wagging the dog.
Login.gov cribbed off of the UK's digital office that built a similar system. I believe that's what OP was alluding to.
How many unemployment systems, prisoner tracking systems, DMV systems do you need? These are common components across governments.
Example: Login.gov now supports local and state government partners. Your constituent IAM needs can now be met by a federal team that is efficient and competent, instead of every city and state reinventing the wheel (poorly and expensively).
It could go that way. But the idea is that Massachusetts might charge EVs extra license fees because they want to replace the lost gas taxes whereas Montana and Mississippi wouldn't. Massachusetts already has different and higher pollution regulations (typically based on California's).
Other states might want to do the same, although the fees would probably differ. So the idea is that 10 or 15 states cluster around one solution for a department, 20 for another, 10 for a third and the rest go their own way. The states would have a lot of power in being able to replace working solution A with B or C. So there's 3 or 4 DMV vendors, there's 3 or 4 unemployment vendors, some for contact tracing (my state of Oregon still hasn't implemented the Google/Apple tracing), and so on.
The current situation is that you know a potential replacement will be late and over budget, you just don't know exactly how bad it will be. And Accenture and IBM like it that way and are very adept at persuading the decision makers that they're very special snowflakes and can't use an off-the-shelf solution.
An alternative is to have the federal government offer the "federal choice" which states and local governments can choose to use instead of rolling out their own.
In states, counties and cities a lot of contracting basically has the purpose of pushing money to well connected people. They don’t want an efficient and cost effective solution.
I know somebody who audits municipalities. We did a graph that showed relations between different players. It’s basically just a big insider club of usually 20-40 people and families that give contracts to each other at the expense of the tax payer.
That's hard to do because all of those systems are intertwined. If you use the Montana DMV program, and you want people who get DUIs to have their driver's license suspended, now you have to use the Montana Penal System program. Except Montana's Penal System has a bunch of exceptions written into it for laws that can be either a misdemeanor or a felony, and they don't allow any time off for good behavior. So now are you going to adopt Montana's laws too? There's tax code stuff in there too, so I guess we're lumping in the Montana tax system as well.
I think the problem is that unlike our more notable branches, we don't hire experts in the field. I don't mean they're incompetent at technology, but that a problem like this really exists at the intersection of government and technology. We keep hiring general-purpose contractors to build things like this, and then we're shocked when it falls apart in the environment governments exist in.
We need companies that specialize in this intersection. Companies that can keep public sentiment in mind and build an architecture that's flexible in the places where society is. It's the same way that most of us in general purpose IT try to build systems that can adapt to changes in the IT landscape. Put it in Docker so we can run it on a cloud, on bare metal, on k8s and probably on whatever's next. Governments struggle to pivot like that due to funding (how do you argue for funding for features since you can't earn revenue?), and because a lot of it is legislated out of their control. Learning to read the public sentiment is just like us reading trends in a newsletter.
This advice appears to be based on deficiencies in programming, however. Programs run algorithms to process data; when the program or algorithm fails to do so properly, the program is at fault.
In your cases you have items like: accounting, building codes, tax codes, automobile codes, etc.
While it makes sense to try and harmonize with the general policies, every state, every municipality, and every business is going to have special cases. Even software has edge cases for protocol behaviors.
What would be nicer, imho, is if all of these laws were written in domain-specific languages that specify the law, so the software could just pick up the definitions signed into law. Lawyers already feel like legal interpreters crossed with red/blue security team members, depending on what they're doing.
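As a toy illustration of the idea (the rule name, conditions, and credit ratio are all invented, not actual law): the statute's conditions become declarative data that the sentencing software evaluates directly, so a legislative amendment is a data change rather than a rewrite.

```python
# Hypothetical "law as data" sketch: each rule lists the facts it requires,
# the facts that disqualify, and the credit ratio it grants.
RULES = {
    "earned_release_credit": {
        "requires": {"offense_eligible", "programming_complete"},
        "excludes": {"violent_felony_prior"},
        "credit_ratio": 1 / 3,  # assumed ratio, for illustration only
    },
}

def evaluate(rule_name: str, facts: set) -> float:
    # Return the credit ratio if all required facts hold and no
    # excluding fact is present; otherwise grant nothing.
    rule = RULES[rule_name]
    if rule["requires"] <= facts and not (rule["excludes"] & facts):
        return rule["credit_ratio"]
    return 0.0
```

A new law would then add or amend an entry in `RULES` instead of triggering a 2,000-hour software project.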
My dad actually created a (failed) startup in the early '00s that modeled immigration law in Prolog, enabling the creation of legally accurate forms and resolving complex legal queries. It was a good idea, it just failed due to infighting and mismanagement.
are there popular languages for implementing these types of DSLs?
FWIW this is actually (mostly) the case for building codes. The standard in the USA is the International Building Code (and other related code by the International Code Council), which each jurisdiction adopts into law and amends as needed for local conditions or practices. And these codes in turn reference other international standards specific to the knowledge domain.
> It seems like there should be 1, 2, or 3 DMV programs. The same for building codes, tax codes, etc. And prison software. You can be more like Massachusetts or Mississippi or Montana (hypothetical examples) but pick one and harmonize with it.
Kind of defeats the entire purpose of having states to start with.
If we want to make the US a centralized, unitary state, let's do that through the elected central government and not through deferral to IT contractors.
“Show me the incentives, I’ll show you the outcome.”
If corrections staff were held personally liable for these failures, or the local jurisdiction faced steep financial penalties, it wouldn’t happen. No liability, no responsibility.
A. A person commits unlawful imprisonment by knowingly restraining another person.
B. In any prosecution for unlawful imprisonment, it is a defense that:
1. The restraint was accomplished by a peace officer or detention officer acting in good faith in the lawful performance of his duty; or
2. The defendant is a relative of the person restrained and the defendant's sole intent is to assume lawful custody of that person and the restraint was accomplished without physical injury.
C. Unlawful imprisonment is a class 6 felony unless the victim is released voluntarily by the defendant without physical injury in a safe place before arrest in which case it is a class 1 misdemeanor.
D. For the purposes of this section, "detention officer" means a person other than an elected official who is employed by a county, city or town and who is responsible for the supervision, protection, care, custody or control of inmates in a county or municipal correctional institution. Detention officer does not include counselors or secretarial, clerical or professionally trained personnel.
You’re assuming that agents of the government are expected to follow their own laws. Those laws are for you and me, not our benighted public servants.
Their list is of people eligible for a program that would give them an early release, so unless the inmate enrolls the prison would be acting in good faith. Almost like the law was intentionally worded to limit their liability.
> Isn't this clearly defined false imprisonment under Arizona law?
> Assumption being that a detention officer is not acting in good faith if they have a list of people who should no longer be detained under state law.
I agree with your premise and assertion, but I'm not sure that's exactly what's happening here. I'd like to preface this by saying I absolutely believe there need to be ramifications; I'm just not sure that it fits "clearly defined false imprisonment." I think a category would have to be added to the false imprisonment statute for "negligence" for this to be considered false imprisonment and let me tell you why:
From what I can tell, this article is talking about a couple of massive issues but the wrongful imprisonment bit is about a specific bug (SB1310) in ACIS that can't calculate an updated release date for inmates that complete special programs that award additional release credits as per an amendment signed into law in 2019. Since they can't automatically update a release date for individuals that have completed this program, they keep track of it manually. To me, the article doesn't read like they have a list of people who should be released but aren't being released because the software says so; from my very limited perspective it reads like there are certain programs an inmate can complete to earn extra release credits and since the system can't track these extra credits, the detention officers do it manually. I would imagine their manual process goes something like this:
1) Compile list of inmates that have earned extra release credits through the aforementioned release programming.
2) Select inmate from list, possibly in order of original release date, earliest first.
3) Calculate the amount of release credits they received from completion of the programming.
4) Calculate the total hours those credits equal.
5) Deduct hours from release date.
6) Manually update the release date in ACIS (likely requiring warden and/or judicial approval, but idk).
6a) Since ACIS now has the appropriate release date, the inmate will be processed for release now (if the date has passed) or as they normally would be.
6b) Remove inmate's name from list unless currently enrolled in early release programming, in which case they are moved to the bottom of the queue.
7) Lather, rinse, repeat.
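For what it's worth, that imagined manual queue could be sketched as a loop like the following. The data shapes, the hours-to-days conversion, and the approval step being elided are all my assumptions, mirroring the guesses in the steps above:

```python
from dataclasses import dataclass

@dataclass
class Inmate:
    name: str
    release_in_days: int       # days until release, per ACIS
    earned_credit_hours: int   # from completed release programming
    still_enrolled: bool       # currently in early release programming

def process_queue(queue: list) -> None:
    # Step 2: order by original release date, earliest first.
    queue.sort(key=lambda i: i.release_in_days)
    for inmate in list(queue):
        # Steps 3-5: convert credit hours to days and deduct from release date.
        inmate.release_in_days -= inmate.earned_credit_hours // 24
        inmate.earned_credit_hours = 0
        # Step 6: the manual ACIS update (and any approvals) would happen here.
        # Step 6b: remove from the list unless still enrolled, in which case
        # the inmate goes to the back of the queue.
        queue.remove(inmate)
        if inmate.still_enrolled:
            queue.append(inmate)
```

Even in this toy form, the obvious failure mode is visible: every arithmetic step is a hand calculation in the real process, and anyone not on the list to begin with (step 1) never gets processed at all.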
Being denied release because of a software error would be hellish for both an inmate and their loved ones... But because it doesn't seem like they have an actual list of people that should have already been released but haven't been because the software made a critical oversight, I don't think it fits the legislation as it exists today for false imprisonment. The tool is broken so they've switched to manual calculation until someone more important decides it's worth fixing.
If we add negligence to the false imprisonment statute, I'd agree wholeheartedly! But IA[very_much]NAL, so I'll confess I don't really know anything about anything.
To color this even further: the hundreds of people who are illegally imprisoned are being held for drug or even just paraphernalia possession. The law that grants them credits explicitly excludes violent felons[1].
If my tax $ goes to it, it should have source available (excepting natsec). it would be nice to get some value out of it. If it's well written, I could learn how a large scale project works. If not, I can have something to petition and voice my concerns about, inform about vulns, etc.
I think there is still a genuine concern that open-source software allows bad people to find loopholes before the good people do. The last thing you want is someone finding a bug that allows a murderer to get released because the computer said-so.
I think it can be managed but it is a genuine concern nonetheless.
In best cases, the test cases are good and pass... and yet such errors will still abound.
Why? Because the spec the tests were written for didn't include some contingency, for example with software that rigidly requires certain steps to happen and doesn't provide a human-controlled override.
This is an outrage. It is also a perfect example of how software is used to create increasingly more elaborate and faceless bureaucracies that force individuals to spend more and more time contending with them. Somehow software has become the ultimate vehicle for bureaucratic violence. Software is simultaneously infallible and the perfect scapegoat. The inmate who lost their phone privileges for 30 days is an example. They did nothing wrong but the computer says so and nothing can be done. The computer is right in the sense that its decision cannot be undone, and solely to blame since no human can undo its edict or be held accountable, apparently. It is tragic and absurd.
There was an Ask HN question the other day where the poster asked if the software we are building is making the world a better place. There were hardly any replies at all. Is this because for the most part our efforts in producing software are actually doing the opposite? It certainly seems that way reading articles like this.
> Instead of fixing the bug, department sources said employees are attempting to identify qualifying inmates manually... But sources say the department isn’t even scratching the surface of the entire number of eligible inmates. “The only prisoners that are getting into programming are the squeaky wheels,” a source said, “the ones who already know they qualify or people who have family members on the outside advocating for them.”
> In the meantime, Lamoreaux confirmed the “data is being calculated manually and then entered into the system.” Department sources said this means “someone is sitting there crunching numbers with a calculator and interpreting how each of the new laws that have been passed would impact an inmate.” “It makes me sick,” one source said, noting that even the most diligent employees are capable of making math errors that could result in additional months or years in prison for an inmate. “What the hell are we doing here? People’s lives are at stake.”
Comments like yours seem to glorify a pre-software world filled with manual entry. The reality is that manual entry is even more error-prone, bias-prone, with more people falling through the cracks.
If nothing else, software can be uniformly applied at a mass scale, and audited for any and all bugs. And faulty software can be exposed through leaks like the above, to expose and fix systemic problems. Whereas a world of manual entry simply ignores vast numbers of errors and biases which are extremely hard to detect/prove, and even then, can simply be scapegoated with some unlucky individuals, without any effort to fix systemically.
We know exactly how to fix it. Our cowardly politicians and toothless regulatory agencies are not up for the challenge.
For every piece of software that can directly and materially harm someone's life like this, there should be a chain of responsibility. And within that chain, there should be legal recourse and, in most cases, penal consequences, especially in the case of inadequate software quality/testing/validation, should the software fail to perform its task correctly. Bonus side effect, software quality will go up across the board in the industry.
I dropped this in a comment elsewhere in this discussion, but also makes sense here...
I find the government "requirements" process tends to create situations like this. Rather than build flexible software that puts some degree of trust in the person using it, they tend to overspecify the current bureaucratic process. In many cases, the person pushing for the software is looking to use it to enforce bureaucratic control that they have been unable to otherwise exercise, with the effect that the people the project initiator wants to use the software simply work around it. They then institute all sorts of punishments and controls to ensure it must be used. This then results in the kind of insane situation we have here, where you can't do something perfectly legal because "computer says no".
> It is also a perfect example of how software is used to create increasingly more elaborate and faceless bureaucracies that force individuals to spend more and more time contending with them.
You are attacking the wrong target. It's the government that's broken. This kind of outrage can happen just as easily with pencil and paper. The root cause is the lack of accountability and desire to make the government function better.
Software that infringes on the public (even if they are criminals) as opposed to software that people can opt to use or not, needs to have a very serious question asked at design time: If the software produces an incorrect result, what mechanism exists to override it/audit it/provide damages etc.
The fact people are not asking that is worrying. I understand why the system was not designed to do something that happened later (even if it could have been reasonably foreseen) but the fact that it was implemented with no override is really the scandal.
I don't know whether this comes down to an amount of power vested in a Governor that means the rest of the organisation can't say, "sorry Guv, but we can't do this because the software wasn't written to." If TV is to be believed, Governors want things done yesterday and leave you to worry about the problems.
> The computer is right in the sense that its decision cannot be undone, and solely to blame since no human can undo its edict or be held accountable, apparently.
This is why penalties are such an important part of the feedback loop. Obviously we can't go back in time and restore someone's phone privileges, but we can award monetary damages for the mistake.
Monetary damages alone won't discourage this behavior, though, as ultimately taxpayers foot the bill. There also must be some degree of accountability for those in charge of the system. Software can't become a tool for dodging accountability. Those in charge of implementing the software, providing the inputs, and managing the outputs must be held accountable for related mistakes.
> There was an Ask HN question the other day where the poster asked if the software we are building is making the world a better place. There were hardly any replies at all.
Few Ask HN questions get many responses. This is also a loaded question, as HN is notorious for nit-picking every response and putting too much emphasis on the downsides. For example, I know farmers who have increased their farm productivity massively using modern hardware and software. However, if I posted that it would inevitably draw concerns about replacing human jobs, right-to-repair issues, and other issues surrounding the space. The world is definitely better off for having more efficient and productive farming techniques, freeing most of us up to do things other than farm.
However, all new advances bring a different set of problems. Instead of trying to force everything into broad categories of better or worse I think it's important to acknowledge that technology makes the world different. Different is a combination of better and worse. The modern world has different problems than we did 100 years ago, but given the choice I wouldn't choose to roll back to the pre-computer era.
> It certainly seems that way reading articles like this.
Both news and social media have a strong bias toward articles that spark anger or outrage. For me, the whole world stops feeling like a dumpster fire when I disconnect from news and social media for a while. I'm looking forward to the post-COVID era where we can get back to interacting with each other in person rather than gathering around a constant stream of negative stories on social media.
It's such a hard question to answer, because software doesn't exist in a vacuum. Hopefully this example is relevant:
You're a software developer maintaining an eCommerce platform, on the one hand your platform helps perpetuate low margin and wasteful consumerism, on the other hand your software enables small businesses to compete in the new online world.
Consumerism is bad, but commerce is as old as civilization and supports all of our lifestyles, so on a macro level you're in a tough spot. You're a talented developer putting their skills to work building something the community needs, I personally think that means you're doing good work in the context of your society, but it is difficult to say if it's making the world a better place.
Social media is the same. On the one hand, it connects family and friends, on the other it drives narcissisms, consumerism and misinformation.
You almost have to try and calculate the "Net Good" or "Net Bad" of a type of software and see how the cards fall. For social media I would suggest that it's currently in a "Net Bad" situation, causing more harm than good for example.
> The inmate who lost their phone privileges for 30 days is an example. They did nothing wrong but the computer says so and nothing can be done. The computer is right in the sense that its decision cannot be undone, and solely to blame since no human can undo its edict or be held accountable, apparently. It is tragic and absurd.
All government software should be open source and anyone should be able to investigate the code and submit bug reports, including inmates. If they know there is something wrong, they have a lot of time on their hands to learn a useful skill to fix these issues.
The government should then not be allowed to close a bug as wontfix or invalid without approval from other citizen watchdogs verifying if a bug report is legitimate.
I'd argue that software like this has saved people from having to do millions of years worth of mundane work. This news is essentially like a traffic accident. Doesn't mean vehicles in general haven't benefitted the human experience. The fact that it is news worthy is evidence that it doesn't happen too often.
For a large segment of the US electorate, anything that inflicts pain on "bad people" is "making the world a better place".
If the software was causing prisoners to be released early, most US voters would be up in arms. But if they're being held too long, the calculus is different. In software terms, for many Americans, a "tough on crime" outcome is a "feature not a bug".
A big problem is that while we might improve the typical use with software, the failure mode is generally ignored and swept under the rug. See google's customer service. You can speed up and improve the average case a thousand times more, driving costs down by maybe a thousand times, or you can bring costs down like 2x but keep the benefits of manual, person centric failure recovery. Even then, non-automation doesn't make it "human" in the sense we all want. A rep in a call center who is only allowed to follow the playbook almost might as well be an automaton for all the freedom they have. Are faceless economies of scale and the bureaucracies they bring the root issue?
What if the root cause is the ever increasing complexity that software is trying to manage? At all levels (legislature, management, bureaucrats, programming languages, developers, testers, users, subjects) we are creating more and more complex situations we ask software and the institutions that produce it to manage for us.
But as the complexity goes up and the number of these complex situations increases, are we reaching a point where we outstrip the amount of money, talent and experience our institutions would need to deliver solutions to successfully manage them?
With our resources and intelligence as a species being capped, it seems at some point this is inevitable.
Yes, software can be used as a cover for abuse like in this case, but that happens because some people in power let it happen. For other pieces of software that have consequences for people with more power than prisoners, the society will not allow failures to happen. I only need to mention the MCAS software of Boeing 737 Max for a counterexample.
Software does not have its own will. Software is only allowed to make decisions on our behalf because we let it do so.
The lack of desire to get things right throughout the bureaucracy is the problem. The software is just a mechanism. Other organizations that actually care figure out ways to get things right even when the software has issues.
You can see in the film Brazil, from 35 years ago, that this was already a problem and concern even without modern software.
> Is this because for the most part our efforts in producing software are actually doing the opposite? It certainly seems that way reading articles like this.
I think the most likely explanation is just that people didn't see the question or weren't interested in having the discussion. Most people believe the work they're doing is at worst neutral. A less likely candidate for the reason (but still more likely than your guess) is that people didn't want to be subjected to unfounded criticism of their work from people who don't know anything about it.
The best solution to the problem is to hold developers personally liable for the software they write, along with its owners. That could mean criminal penalties for negligent violations of industry standards and processes, but it will mostly result in civil penalties.
The second- and third-order consequence is that developers will insulate themselves behind licensing and proofs of practice like every other industry.
Until people actually advocate for real penalties for such harmful violations they don’t care. All their temporary whining and crying is just blowing smoke up our asses.
I am not sure how a software bug is the exclusive enabler here, since an administrative error is just as plausible with pen and paper and a compliant warden.
> if the software we are building is making the world a better place
No, it's now all about "extracting value", "rent seeking", "subscriptions", "censorship", "monopoly" and "control". We got bribed by FAANG and this is the consequence.
Software is just a tool, it can be used to build good or bad things.
It would be hard to see this in e.g. Scandinavian countries, where incarceration is seen as rehabilitative rather than punitive.
In the US, racial discrimination and free-market extremism, along with "tough on crime" laws, have created unimaginably cruel systems; together with private prisons, the goal has been cutting costs rather than rehabilitating prisoners. Software is just a tool to further that goal.
The 2k hours may very well be excessive, and I don't care if it takes 20k hours; if that's the case, they should mothball their software and do it manually. But just calling it a "bug" is misleading IMO.
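As a rough sketch of why this reads more like a rules change than a bug: the old calculation was a single line, while the amended rules depend on several eligibility inputs. Everything below is hypothetical — the 3-days-per-7 rate for eligible offenses is my assumption about SB1310, and all function and parameter names are invented.

```python
def earned_release_credit_days_old(days_served: int) -> int:
    # Old universal standard: 1 day of credit for every 6 days served.
    return days_served // 6

def earned_release_credit_days_new(days_served: int, *,
                                   offense_eligible: bool,
                                   has_disqualifying_conviction: bool,
                                   completed_required_courses: bool,
                                   has_prior_disqualifying_felony: bool) -> int:
    # Sketch of the amended rules: the enhanced rate applies only when
    # every eligibility condition holds.
    eligible = (offense_eligible
                and not has_disqualifying_conviction
                and completed_required_courses
                and not has_prior_disqualifying_felony)
    if not eligible:
        # Ineligible inmates stay on the old standard rate.
        return days_served // 6
    # Assumed enhanced rate: 3 days of credit per 7 days served.
    # The real statute has more cases and caveats than this.
    return (days_served * 3) // 7
```

Even this toy version shows the shape of the problem: the old rule needed one input, the new one needs conviction history, course completion, and offense classification, which likely weren't all modeled in ACIS.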
You'd have the private prisons and the prison guards union climbing up everyone's posteriors.
Spending money will remain an economic decision until we can have government agencies fueled by the righteous indignation of their critics rather than by a line item added to their budget. Until you can convert that indignation into legal tender, agencies will remain subject to old-fashioned accounting constraints.
Seems plausible it could balloon.
I’m surprised this doesn’t create a massive liability for the state.
That would basically result in patients not getting their ostomy bags on time, and I can't even imagine what would follow afterwards: the reactions of patients and their relatives, the levels of stress hospital employees would be subjected to, and so on.
I left the company some months after that, and I don't know what the final decision was, but they'd been warned.
Maybe one day some set of ethical standards will be considered non-functional requirements as important as robustness, security and others.
With technicians being responsible for warning their managers, managers being responsible for assessing risks and documenting their decisions, everything being made transparently and everybody being accountable.
That this problem is allowed to persist seems like an indication that the people in charge believe that prisoners have a low probability of successfully suing the state for damages.
I think about that every time I read about another government (or private!) company that wastes tens or hundreds of millions of dollars (or euros, or pounds) on custom software.
It seems like there should be 1, 2, or 3 DMV programs. The same for building codes, tax codes, etc. And prison software. You can be more like Massachusetts or Mississippi or Montana (hypothetical examples) but pick one and harmonize with it.
1: compiled is the lowest of 3 standards that outside accountants can do; "reviewed" is higher and "audited" is the highest. Even at the compiled level they mailed out postcards to a certain number of customers asking if they were customers over the past year and had spent this much money. It was fairly easy for the acquiring company's outside accountants to review PWC's work and bring it up to audited standard.
How many unemployment systems, prisoner tracking systems, DMV systems do you need? These are common components across governments.
Example: Login.gov now supports local and state government partners. Your constituent IAM needs can now be met by a federal team that is efficient and competent, instead of every city and state reinventing the wheel (poorly and expensively).
Other states might want to do the same, although the fees would probably differ. So the idea is that 10 or 15 states cluster around one solution for a department, 20 for another, 10 for a third and the rest go their own way. The states would have a lot of power in being able to replace working solution A with B or C. So there's 3 or 4 DMV vendors, there's 3 or 4 unemployment vendors, some for contact tracing (my state of Oregon still hasn't implemented the Google/Apple tracing), and so on.
The current situation is that you know a potential replacement will be late and over budget, you just don't know exactly how bad it will be. And Accenture and IBM like it that way and are very adept at persuading the decision makers that they're very special snowflakes and can't use an off-the-shelf solution.
I know somebody who audits municipalities. We did a graph that showed relations between different players. It's basically just a big insider club of usually 20-40 people and families that give contracts to each other at the expense of the taxpayer.
I think the problem is that unlike our more notable branches, we don't hire experts in the field. I don't mean they're incompetent at technology, but that a problem like this really exists at the intersection of government and technology. We keep hiring general-purpose contractors to build things like this, and then we're shocked when it falls apart in the environment governments exist in.
We need companies that specialize in this intersection. Companies that can keep public sentiment in mind and build an architecture that's flexible in the places where society is. It's the same way that most of us in general purpose IT try to build systems that can adapt to changes in the IT landscape. Put it in Docker so we can run it on a cloud, on bare metal, on k8s and probably on whatever's next. Governments struggle to pivot like that due to funding (how do you argue for funding for features since you can't earn revenue?), and because a lot of it is legislated out of their control. Learning to read the public sentiment is just like us reading trends in a newsletter.
In your cases you have items like: accounting, building codes, tax codes, automobile codes, etc.
While it makes sense to try and harmonize with the general policies, every state, every municipality, and every business is going to have special cases. Even software has edge cases for protocol behaviors.
What would be nicer, imho, is if all of these laws were written in domain-specific languages that specify the law, so that the software could just pick up the definitions signed into law. Lawyers already act as a combination of legal interpreters and red/blue security team members, depending on what they are doing.
are there popular languages for implementing these types of DSLs?
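There are research languages aimed at exactly this; Catala, for instance, is designed for turning statutory text into executable code. As a language-agnostic sketch of the idea (the rule encoding and every name here are invented), a legal DSL might compile an eligibility rule down to declarative data that ordinary software then consumes:

```python
# Hypothetical compiled form of an eligibility rule: the statute becomes
# data, and the evaluation logic stays generic across all rules.
RULE_SB1310 = {
    "credit_rate": (3, 7),  # assumed: 3 days of credit per 7 served
    "requires": ["eligible_offense", "completed_courses"],
    "excludes": ["disqualifying_conviction", "prior_disqualifying_felony"],
}

def qualifies(inmate_flags: set[str], rule: dict) -> bool:
    # An inmate qualifies when every required flag is present
    # and no excluding flag is.
    return (all(r in inmate_flags for r in rule["requires"])
            and not any(e in inmate_flags for e in rule["excludes"]))
```

The appeal is that when the legislature amends the rule, you regenerate the data instead of rewriting the evaluator — which is roughly what the ACIS vendor apparently could not do cheaply.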
Kind of defeats the entire purpose of having states to start with.
If we want to make the US a centralized, unitary state, let's do that through the elected central government and not through deferral to IT contractors.
All of a sudden that person could no longer make calls for 30 days, and they did nothing wrong to get that.
If corrections staff were held personally liable for these failures, or the local jurisdiction faced steep financial penalties, it wouldn’t happen. No liability, no responsibility.
Here's the relevant statute:
13-1303. Unlawful imprisonment; classification; definition
A. A person commits unlawful imprisonment by knowingly restraining another person.
B. In any prosecution for unlawful imprisonment, it is a defense that:
1. The restraint was accomplished by a peace officer or detention officer acting in good faith in the lawful performance of his duty; or
2. The defendant is a relative of the person restrained and the defendant's sole intent is to assume lawful custody of that person and the restraint was accomplished without physical injury.
C. Unlawful imprisonment is a class 6 felony unless the victim is released voluntarily by the defendant without physical injury in a safe place before arrest in which case it is a class 1 misdemeanor.
D. For the purposes of this section, "detention officer" means a person other than an elected official who is employed by a county, city or town and who is responsible for the supervision, protection, care, custody or control of inmates in a county or municipal correctional institution. Detention officer does not include counselors or secretarial, clerical or professionally trained personnel.
https://www.azleg.gov/ars/13/01303.htm
Assumption being that a detention officer is not acting in good faith if they have a list of people who should no longer be detained under state law.
> Assumption being that a detention officer is not acting in good faith if they have a list of people who should no longer be detained under state law.
I agree with your premise and assertion, but I'm not sure that's exactly what's happening here. I'd like to preface this by saying I absolutely believe there need to be ramifications; I'm just not sure that it fits "clearly defined false imprisonment." I think a category would have to be added to the false imprisonment statute for "negligence" for this to be considered false imprisonment and let me tell you why:
From what I can tell, this article is talking about a couple of massive issues but the wrongful imprisonment bit is about a specific bug (SB1310) in ACIS that can't calculate an updated release date for inmates that complete special programs that award additional release credits as per an amendment signed into law in 2019. Since they can't automatically update a release date for individuals that have completed this program, they keep track of it manually. To me, the article doesn't read like they have a list of people who should be released but aren't being released because the software says so; from my very limited perspective it reads like there are certain programs an inmate can complete to earn extra release credits and since the system can't track these extra credits, the detention officers do it manually. I would imagine their manual process goes something like this:
1) Compile list of inmates that have earned extra release credits through the aforementioned release programming.
2) Select inmate from list, possibly in order of original release date, earliest first.
3) Calculate the amount of release credits they received from completion of the programming.
4) Calculate the total hours those credits equal.
5) Deduct hours from release date.
6) Manually update the release date in ACIS (likely requiring warden and/or judicial approval, but idk).
6a) Since ACIS now has the appropriate release date, the inmate will be processed for release now (if the date has passed) or as they normally would be.
6b) Remove inmate's name from list unless currently enrolled in early release programming, in which case they are moved to the bottom of the queue.
7) Lather, rinse, repeat.
Being denied release because of a software error would be hellish for both an inmate and their loved ones... But because it doesn't seem like they have an actual list of people that should have already been released but haven't been because the software made a critical oversight, I don't think it fits the legislation as it exists today for false imprisonment. The tool is broken so they've switched to manual calculation until someone more important decides it's worth fixing.
If we add negligence to the false imprisonment statute, I'd agree wholeheartedly! But IA[very_much]NAL, so I'll confess I don't really know anything about anything.
EDIT: formatting
[1]: https://corrections.az.gov/sites/default/files/documents/PDF...
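The manual workflow sketched in the numbered steps above could look something like this in code; all field names and the `update_acis` callback (standing in for the manual ACIS data entry and its approvals) are invented for illustration:

```python
from datetime import timedelta

def recalculate_release_dates(inmates, update_acis):
    # Steps 1-2: compile the inmates who earned extra credits,
    # ordered by original release date, earliest first.
    queue = sorted(
        (i for i in inmates if i["extra_credit_days"] > 0),
        key=lambda i: i["release_date"],
    )
    for inmate in queue:
        # Steps 3-5: deduct the earned credit days from the release date.
        new_date = inmate["release_date"] - timedelta(days=inmate["extra_credit_days"])
        # Step 6: manually enter the corrected date into ACIS
        # (in reality, warden and/or judicial approval may be required).
        update_acis(inmate["id"], new_date)
    return len(queue)  # how many records were corrected this pass
```

Written out like this, it is also obvious where a tired human with a calculator can slip: every credit-to-days conversion and date subtraction is a chance for the math errors the article's sources worry about.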
See also: employment security sites, cannabis track and trace, driving license, etc.
Some of these bugs cause direct financial harm to citizens and this one is much worse!
Show me the test cases! Show me the code!!
https://www.azleg.gov/ARStitle/
I think it can be managed but it is a genuine concern nonetheless.
Why? Because the spec the tests were written against didn't include some contingency, for example with software that rigidly requires certain steps to happen and doesn't provide a human-controlled override.
There was an Ask HN question the other day where the poster asked if the software we are building is making the world a better place. There were hardly any replies at all. Is this because for the most part our efforts in producing software are actually doing the opposite? It certainly seems that way reading articles like this.
> Instead of fixing the bug, department sources said employees are attempting to identify qualifying inmates manually... But sources say the department isn’t even scratching the surface of the entire number of eligible inmates. “The only prisoners that are getting into programming are the squeaky wheels,” a source said, “the ones who already know they qualify or people who have family members on the outside advocating for them.”
> In the meantime, Lamoreaux confirmed the “data is being calculated manually and then entered into the system.” Department sources said this means “someone is sitting there crunching numbers with a calculator and interpreting how each of the new laws that have been passed would impact an inmate.” “It makes me sick,” one source said, noting that even the most diligent employees are capable of making math errors that could result in additional months or years in prison for an inmate. “What the hell are we doing here? People’s lives are at stake.”
Comments like yours seem to glorify a pre-software world filled with manual entry. The reality is that manual entry is even more error-prone and bias-prone, with more people falling through the cracks.
If nothing else, software can be uniformly applied at a mass scale, and audited for any and all bugs. And faulty software can be exposed through leaks like the above, to expose and fix systemic problems. Whereas a world of manual entry simply ignores vast numbers of errors and biases which are extremely hard to detect/prove, and even then, can simply be scapegoated with some unlucky individuals, without any effort to fix systemically.
For every piece of software that can directly and materially harm someone's life like this, there should be a chain of responsibility. And within that chain, there should be legal recourse and, in most cases, penal consequences, especially in the case of inadequate software quality/testing/validation, should the software fail to perform its task correctly. Bonus side effect, software quality will go up across the board in the industry.
I find the government "requirements" process tends to create situations like this. Rather than build flexible software that puts some degree of trust in the person using it, they tend to overspecify the current bureaucratic process. In many cases, the person pushing for the software is looking to use software to enforce bureaucratic control that they have been unable to otherwise exercise, with the effect that the people the project initiator wants to use the software simply work around it. They then institute all sorts of punishments and controls to ensure it must be used. This then results in the kind of insane situation we have here, where you can't do something perfectly legal because "computer says no".
You are attacking the wrong target. It's the government that's broken. This kind of outrage can happen just as easily with pencil and paper. The root cause is the lack of accountability and desire to make the government function better.
The fact people are not asking that is worrying. I understand why the system was not designed to do something that happened later (even if it could have been reasonably foreseen) but the fact that it was implemented with no override is really the scandal.
I don't know whether this comes down to the amount of power vested in a Governor, such that the rest of the organisation can't say, "sorry Guv, but we can't do this because the software wasn't written to." If TV is to be believed, Governors want things done yesterday, and you worry about the problems later.
This is why penalties are such an important part of the feedback loop. Obviously we can't go back in time and restore someone's phone privileges, but we can award monetary damages for the mistake.
Monetary damages alone won't discourage this behavior, though, as ultimately taxpayers foot the bill. There also must be some degree of accountability for those in charge of the system. Software can't become a tool for dodging accountability. Those in charge of implementing the software, providing the inputs, and managing the outputs must be held accountable for related mistakes.
> There was an Ask HN question the other day where the poster asked if the software we are building is making the world a better place. There were hardly any replies at all.
Few Ask HN questions get many responses. This is also a loaded question, as HN is notorious for nit-picking every response and putting too much emphasis on the downsides. For example, I know farmers who have increased their farm productivity massively using modern hardware and software. However, if I posted that it would inevitably draw concerns about replacing human jobs, right-to-repair issues, and other issues surrounding the space. The world is definitely better off for having more efficient and productive farming techniques, freeing most of us up to do things other than farm.
However, all new advances bring a different set of problems. Instead of trying to force everything into broad categories of better or worse I think it's important to acknowledge that technology makes the world different. Different is a combination of better and worse. The modern world has different problems than we did 100 years ago, but given the choice I wouldn't choose to roll back to the pre-computer era.
> It certainly seems that way reading articles like this.
Both news and social media have a strong bias toward articles that spark anger or outrage. For me, the whole world stops feeling like a dumpster fire when I disconnect from news and social media for a while. I'm looking forward to the post-COVID era where we can get back to interacting with each other in person rather than gathering around a constant stream of negative stories on social media.
You're a software developer maintaining an eCommerce platform, on the one hand your platform helps perpetuate low margin and wasteful consumerism, on the other hand your software enables small businesses to compete in the new online world.
Consumerism is bad, but commerce is as old as civilization and supports all of our lifestyles, so on a macro level you're in a tough spot. You're a talented developer putting their skills to work building something the community needs, I personally think that means you're doing good work in the context of your society, but it is difficult to say if it's making the world a better place.
Social media is the same. On the one hand, it connects family and friends; on the other, it drives narcissism, consumerism and misinformation.
You almost have to try and calculate the "Net Good" or "Net Bad" of a type of software and see how the cards fall. For social media I would suggest that it's currently in a "Net Bad" situation, causing more harm than good for example.
All government software should be open source and anyone should be able to investigate the code and submit bug reports, including inmates. If they know there is something wrong, they have a lot of time on their hands to learn a useful skill to fix these issues.
The government should then not be allowed to close a bug as wontfix or invalid without approval from other citizen watchdogs verifying if a bug report is legitimate.
For a large segment of the US electorate, anything that inflicts pain on "bad people" is "making the world a better place".
If the software was causing prisoners to be released early, most US voters would be up in arms. But if they're being held too long, the calculus is different. In software terms, for many Americans, a "tough on crime" outcome is a "feature not a bug".