Talk:Artificial intelligence

From Wikipedia, the free encyclopedia

Article milestones
Date | Process | Result
August 6, 2009 | Peer review | Reviewed

Semi-protected edit request on 13 November 2022

I'm requesting to add a section under "risks" of artificial intelligence

Gender Bias in Artificial Intelligence: As artificial intelligence continues to evolve and learn, it's important to address the fact that the field of AI is extremely male-dominated and how that impacts the way AI learns language and values. In an article written by Susan Leavy of University College Dublin, she discusses the gendered language used when referencing male and female roles. For example: the terms "mankind" and "man" referring to all of humanity, work roles such as firefighting being seen as male, and the words used to describe family, such as how a father may be called a "family man" while women have no equal term. If these societal norms aren't challenged throughout the advancement of AI, the small ways that language differs between genders will be embedded into the AI's memory and further reinforce gender inequality for future generations.

Leavy, Susan. "Gender Bias in Artificial Intelligence." Proceedings of the 1st International Workshop on Gender Equality in Software Engineering, ACM Digital Library, 28 May 2018, https://dl.acm.org/doi/pdf/10.1145/3195570.3195580. Kawahsaki (talk) 19:57, 13 November 2022 (UTC)

 Not done: Hello Kawahsaki, and welcome to Wikipedia! I'm afraid I have to decline to perform this request for a couple of reasons.
One of the conditions for an edit request to succeed is that it be uncontroversial. Gender bias as a topic is certainly controversial in the world today, and so the creation of an entire section based on such a topic would be out of scope here.
Additionally, I have concerns regarding the prose you've written. Wikipedia strives to maintain a neutral point of view when describing topics; our sole goal is to describe what reliable independent sources say on a given topic. This is because we are a tertiary source. Some of your prose seems to fall below this guideline. An example is the phrase it is important to address the fact. Wikipedia may state that a source believes something is important, but Wikipedia would not say something like this in its own voice.
Now, this page is currently under what we call semi-protection. This means that only editors with accounts that are 3 days old and have a total of 10 edits may edit the page. If you make 9 more edits anywhere on Wikipedia (and there are plenty of eligible pages), and wait until November 16th, you'll be able to edit this page directly.
Feel free to drop by my talk page (Wikipedia's version of direct messages) if you have any questions, or you can ask them at the Teahouse, which is a venue that specializes in answering questions from new editors.
Cheers, and happy editing! —Sirdog (talk) 04:26, 14 November 2022 (UTC)
I think you could add your contribution to the main article on algorithmic bias. This article only has room for a paragraph or so on the topic. ---- CharlesTGillingham (talk) 06:40, 28 November 2022 (UTC)

Using a non-circular (and more correct) short description

The current short description is:

Intelligence demonstrated by machines

This is:

  1. circular
  2. "machines" is too narrow; AI has more to do with machine learning.

Regarding #2, I can "do" AI, e.g. a feed-forward neural network on paper; that is just as much AI as in silico.
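To illustrate the point that a small feed-forward network can be evaluated entirely by hand, here is a minimal sketch. The network, weights, and XOR task are illustrative assumptions chosen for this example, not taken from the discussion above.

```python
# A 2-input, 2-hidden, 1-output feed-forward network, small enough
# to evaluate with pencil and paper. All weights are illustrative.

def step(x):
    # Threshold activation: trivial to compute by hand.
    return 1 if x >= 0 else 0

def forward(x1, x2):
    # Hidden layer: one neuron fires for OR, one for AND.
    h1 = step(1.0 * x1 + 1.0 * x2 - 0.5)   # fires if x1 OR x2
    h2 = step(1.0 * x1 + 1.0 * x2 - 1.5)   # fires if x1 AND x2
    # Output layer: h1 AND NOT h2, i.e. XOR of the inputs.
    return step(1.0 * h1 - 2.0 * h2 - 0.5)

# XOR truth table, each row checkable on paper:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, forward(a, b))
```

Each forward pass is three weighted sums and three threshold comparisons, which is the sense in which the computation needs no silicon at all.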

I propose to use (in line with the main body text):

The ability of systems to perceive, synthesize, and infer information

Bquast (talk) 15:41, 15 November 2022 (UTC)

  1. Don't think it's circular -- it just assumes the reader already knows what intelligence is. E.g., defining "puppy dog" as "a young dog".
  2. How about replacing "machines" with "machines or software"?
"perceive, synthesize, infer" ... hmm ... you left out "learn" ... and "knowledge" ... but, frankly, intelligence is so notoriously difficult to define that we're just opening a can of worms trying to define it here -- you'll have the whole history of philosophy and psychology picking away at you. Better to just leave it out.
My two cents. ---- CharlesTGillingham (talk) 06:24, 28 November 2022 (UTC)

Why the Oxford Dictionary definition is inadequate

The article currently quotes the Oxford dictionary to define AI: "the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages."

This definition is rejected by the leading AI textbook (see Chapter 2, Artificial Intelligence: A Modern Approach) and by AI founder John McCarthy, who coined the term "artificial intelligence" (see multiple citations in the article; just search for his name).

A brief introduction to the problems with definition:

The problem is this phrase: "tasks that normally require human intelligence". Consider these two lists:

Tasks that require considerable human intelligence:

  • Multiplying large numbers.
  • Memorizing long lists of information.
  • Doing high school algebra
  • Solving a linear differential equation
  • Playing chess at a beginner's level.

Tasks that do not require human intelligence (i.e. "unintelligent" small children or animals can do it):

  • Facial recognition
  • Visual perception
  • Speech recognition
  • Walking from room to room without bumping into something
  • Picking up an egg without breaking it
  • Noticing who is speaking

The Oxford definition categorizes programs that can do tasks from list 1 as AI, and categorizes programs from list 2 as being outside of AI's scope. This is obviously not what is actually happening out in the field -- exactly the opposite, in most cases. All of the problems in list 1 were solved back in the 1960s, with computers far less powerful than the one in your microwave or clock radio. The problems in list 2 have only been solved recently, if at all.

Activities considered "intelligent" when a human does them can sometimes be relatively easy for machines, and sometimes activities that would never appear particularly "intelligent" when a human does them can be incredibly difficult for machines (see Moravec's paradox). Thus the definition of artificial intelligence can't just be in terms of "human intelligence" -- a more general definition is needed. The Oxford dictionary definition is not adequate.

My recommendation

Scrap the extended definition altogether: just stick with the naive common-usage definition and go directly to the examples (i.e., paragraph two of the lede).

Leave the difficult problem of defining "intelligence" (without reference to human intelligence) to the section "Defining AI" deeper in the article. This section considers the major issues, and should settle on "rationality" (i.e. goal-directed behavior) as Russell and Norvig do, and as John McCarthy did.---- CharlesTGillingham (talk) 04:31, 28 November 2022 (UTC)

Actually, I just noticed, it doesn't exist any more! I will restore this very brief philosophical discussion, without any mention of "intelligent agents". And I will leave Google's definition as well. ---- CharlesTGillingham (talk) 04:53, 28 November 2022 (UTC)
Please review the history: the Russell definition was moved to the intelligent agent article. It is not adequate for artificial intelligence because it includes all kinds of procedural actions that are of interest to fields like political science, but are not the essence of AI itself. Bquast (talk) 14:29, 30 November 2022 (UTC)
@Bquast: I have not re-added intelligent agents. I agreed with your proposal of eliminating "intelligent agent" from this article entirely. I re-added criticism of the Turing test and of human simulation as a definition of AI. I have restored it, if that's okay with you. ---- CharlesTGillingham (talk) 04:31, 4 December 2022 (UTC)
@Bquast: By the way, I noticed the Google definition doesn't have a working citation, and I can't seem to find it. Would you mind fixing that? ---- CharlesTGillingham (talk) 05:22, 4 December 2022 (UTC)
@CharlesTGillingham ok, sorry then I misunderstood your intention. In general I agree that this current definition is not good. Intelligence can be human or animal (or plants?). I'm not sure about your list; many of the "dumb" tasks do require intelligence. I would not consider facial recognition _not_ intelligence.
Regarding the link from Google, I put the direct citation of the OED, but you can find it like this: https://www.google.com/search?q=artificial+intelligence+definition I will try to add it soon. Bquast (talk) 02:49, 6 December 2022 (UTC)

References, further reading, notes, etc. cleanup

A major cleanup is needed of all these sections. It seems like many authors have inserted their own (maybe) relevant material here. These sections should contain references for the text used. They should also avoid citing the same reference in many different places, in particular the confusingly repeated Russell and Norvig book. Bquast (talk) 14:32, 30 November 2022 (UTC)

Articles in this area are prone to reference spamming. I've done some work on this at related articles but not on this one. I'm also keeping a watch on this and related articles so that it doesn't get worse. North8000 (talk) 21:55, 30 November 2022 (UTC)
Wikipedia requires reliable sources. An article should include citations only to the most reliable sources available. There is no reason to include more references to less reliable sources, or to exclude references to the most reliable sources.
There is no more reliable source about AI than Russell and Norvig, the leading textbook, used in thousands of introductory university courses about AI. There is a vast body of less reliable sources about AI. There is a lot of dissent, new ideas, outsider perspectives, home brews, sloppy journalism, self-promotion and so on. Wikipedia has to take a NPOV on this huge variety, and we don't have room to cover them all. Thus we, as editors, need to prove that every contribution reflects "mainstream" and "consensus" views on the subject. This is all we have room for. This is all that is relevant here. The dozens of citations in this article to the leading textbook are a way of showing that each contribution is mainstream and consensus, and a way of weeding out the fringe. ---- CharlesTGillingham (talk) 04:55, 4 December 2022 (UTC)
Please take care that cites you remove are not still in use by references. Removing cites that have short-form references causes "no target" errors. -- LCU ActivelyDisinterested transmissions °co-ords° 09:37, 8 December 2022 (UTC)

A Commons file used on this page or its Wikidata item has been nominated for deletion

The following Wikimedia Commons file used on this page or its Wikidata item has been nominated for deletion:

Participate in the deletion discussion at the nomination page. —Community Tech bot (talk) 22:24, 15 March 2023 (UTC)

Not used here. CharlesTGillingham (talk) 10:10, 23 March 2023 (UTC)