Forum - View topic: INTEREST: British Police Develop 'Psycho-Pass' AI to Predict Crimes Before They Happen
Note: this is the discussion thread for this article
encrypted12345
Posts: 718

I don't think we should use AI to predict criminality when Steam can't even give recommended games that make sense.
Zin5ki
Posts: 6680 | Location: London, UK

Our government watchdogs tend to resemble chihuahuas more than rottweilers, it should be said. (And regarding matters related to computing, they are trained to gratify their technophobic masters above anything else.)
harminia
Posts: 2003 | Location: australia

ughhh minority report
Beatdigga
Posts: 4372 | Location: New York

Life should not resemble an episode of Black Mirror. Or Minority Report.
Crabtree1
Posts: 106 | Location: Aberdeenshire

I KNEW something like this was going to happen eventually! Please rename it the Sybil System, please!

Now, if the UK government could give this to every English and Welsh police officer, they might be more useful.
zunderdog24
Posts: 362

What's wrong with offering some support to people who have a high chance of committing a crime? There's also a good chance that the incarceration rate will drop.
Kyo Hisagi
Posts: 255

If it won't be named Anime Crimes Division, I'm not interested.
Sobe
Posts: 882

It's not so much a question of whether offering support is morally right or wrong; the problem is that they plan to intervene when nothing has occurred in the first place. It's probably the same issue Minority Report brought up (I wouldn't know; I never saw it), but if no one intervened, that doesn't automatically mean the crime was guaranteed to be committed anyway. It's like sending someone who has never committed a crime to a mental asylum just because they think about death and/or committing murder every second they're alive.

There's also a good chance the AI, and the choices made from it, will do more harm than good. Offering support isn't always the answer; sometimes people have to learn on their own how to stand on their own two legs, without support, in order to really be saved. Who's to say the support offered based on the AI isn't worse than the support offered by someone or something else? There's always more than one answer, but that doesn't mean all of those answers are the best. And best for whom? Everyone but the future suspect? Only a few people, at the cost of the majority?
Puniyo
Posts: 271

True, but a great number of crimes are totally preventable, and the police's reluctance to do anything until someone gets hurt is a big factor in that. Even if this isn't the right way to go about it, the police definitely need to be more proactive, but the constant cuts in funding don't allow them to do so.
CatSword
Posts: 1489

The problem with "offering some support" is that the A.I. system won't be perfect. Perfectly normal people will be flagged and accused of having urges to commit heinous crimes. There will be more than a few stories of innocent people sobbing as the police terrorize them and ask them why they want to shoot up the local animal shelter.
Agent355
Posts: 5113 | Location: Crackberry in hand, thumbs at the ready...

As the article points out, people are already more likely to be arrested based on factors like poverty and race rather than whether they are actually more likely to commit a crime. Cutting funds to social programs and police services in general, and focusing only on people perceived to be more likely to commit crimes based on flawed AI, is probably not a good idea, besides being generally creepy and susceptible to human rights abuses.
kotomikun
Posts: 1205

...well, that's the issue. This isn't the right way to go about it. There's a persistent belief that "crime" is caused by evil forces separate from normal society, and that the only solution is to go after the bad people who do crimes and punish them. The police may claim they want to help these pre-crime suspects, but in practice this system will be ripe for abuse, just like the rest of our justice system.

Actual lasting crime reduction can only come about through structural changes to our society and culture that cut down on poverty, racism, and other factors that push people into situations where they're likely to resort to robbery or violence. Which has happened, to some degree, but we have a long way to go, and dystopian thoughtcrime technologies would only make the situation worse.
Weazul-chan
Posts: 625 | Location: Michigan

Obviously no decent science fiction fans were involved with this, otherwise it would have been scrapped as a bad idea. Anyone who has read or watched sci-fi featuring things like this would know just how wrong it can potentially go, and how easily it can be taken to harmful, ends-justify-the-means extremes.
v1cious
Posts: 6203 | Location: Houston, TX

No surprise really, we've been headed in this direction for a while. You only need to look at what's happening in China now to see the endgame: https://twitter.com/Psythor/status/1056811593177227264

Welcome to the future.
WANNFH
Posts: 1700

British, you say? Want to bet that this system will treat a person's nationality as a bigger risk factor than any actual facts about whether they're a criminal? And of course it will be imperfect enough to escalate a harmless or joking message on social media to the level of a real threat to someone's life. And of course it won't help prevent actual crimes one bit, despite all the costs.

Because hey, that's the actual British way.
All times are GMT - 5 Hours
Powered by phpBB © 2001, 2005 phpBB Group