Wiki
We have compiled hundreds of related entries to help you understand "artificial intelligence".
Unsupervised learning is a learning paradigm in which the training set carries no corresponding category labels; the model must discover structure in the data on its own.
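As a minimal sketch of unsupervised learning, the snippet below clusters unlabeled points with k-means; it assumes scikit-learn and NumPy are installed, and the two-blob toy data is purely illustrative.

import numpy as np
from sklearn.cluster import KMeans

# Unlabeled data: two loose blobs, no category labels are given to the model.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, size=(50, 2)),
               rng.normal(3.0, 0.5, size=(50, 2))])

# k-means discovers the group structure on its own.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_[:10])        # cluster assignments inferred from the data
print(kmeans.cluster_centers_)    # learned cluster centers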
The sample space is the set of all possible outcomes of an experiment or random trial; each individual outcome is called a sample point.
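For a concrete illustration (in LaTeX), the sample space of one roll of a fair six-sided die, one of its sample points, and an event built from sample points:

\[
\Omega = \{1, 2, 3, 4, 5, 6\}, \qquad
\omega = 3 \in \Omega \ \text{is a sample point}, \qquad
A = \{2, 4, 6\} \subseteq \Omega \ \text{is the event "even number".}
\]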
A self-organizing map (SOM) or self-organizing feature map (SOFM) is an artificial neural network (ANN) that uses unsupervised learning to produce a low-dimensional (usually two-dimensional) discretized representation of the input space of training examples.
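Below is a minimal NumPy sketch of the SOM training loop (find the best-matching unit, then pull nearby map units toward the input); the grid size, learning rate, and neighborhood radius are illustrative assumptions, not values from this entry.

import numpy as np

rng = np.random.default_rng(0)
grid_h, grid_w, dim = 8, 8, 3            # 8x8 map of 3-dimensional weight vectors
weights = rng.random((grid_h, grid_w, dim))
coords = np.dstack(np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"))

X = rng.random((200, dim))               # unlabeled training vectors (e.g. RGB colors)

for t, x in enumerate(X):
    lr = 0.5 * np.exp(-t / len(X))       # decaying learning rate
    sigma = 3.0 * np.exp(-t / len(X))    # decaying neighborhood radius
    # Best-matching unit: the map cell whose weight vector is closest to the input.
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # Pull the BMU and its grid neighbors toward the input, weighted by grid distance.
    grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=2)
    h = np.exp(-grid_dist2 / (2 * sigma ** 2))[..., None]
    weights += lr * h * (x - weights)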
A recurrent neural network (RNN) is a network model used to process sequence data, in which the current output depends on earlier steps of the sequence: the network carries a hidden state forward from one step to the next.
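A minimal sketch of that recurrence, using NumPy and illustrative dimensions: the hidden state carries information from earlier steps, so each output depends on the whole prefix of the sequence.

import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim, output_dim = 4, 8, 2      # illustrative sizes
W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
W_hy = rng.normal(scale=0.1, size=(output_dim, hidden_dim))
b_h = np.zeros(hidden_dim)
b_y = np.zeros(output_dim)

def rnn_forward(xs):
    """Run a vanilla RNN over a sequence; each output depends on the hidden-state history."""
    h = np.zeros(hidden_dim)
    ys = []
    for x in xs:                                  # one step per sequence element
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)    # new state mixes input and previous state
        ys.append(W_hy @ h + b_y)                 # current output reflects all earlier inputs
    return ys

outputs = rnn_forward(rng.normal(size=(5, input_dim)))  # a toy length-5 sequence
print(len(outputs), outputs[-1].shape)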
The rectified linear unit (ReLU), also known as the linear rectification function, is a commonly used activation function in artificial neural networks, usually referring to nonlinear functions represented by ramp functions and their variants.
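As a quick sketch, ReLU is the ramp function f(x) = max(0, x); the NumPy snippet below also shows the leaky-ReLU variant alluded to above (the 0.01 negative slope is just a common illustrative choice).

import numpy as np

def relu(x):
    """Rectified linear unit: passes positive values, zeroes out the rest."""
    return np.maximum(0.0, x)

def leaky_relu(x, negative_slope=0.01):
    """A common ReLU variant that keeps a small slope for negative inputs."""
    return np.where(x > 0, x, negative_slope * x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))         # approximately [0, 0, 0, 1.5, 3]
print(leaky_relu(x))   # approximately [-0.02, -0.005, 0, 1.5, 3]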
Natural language understanding (NLU) is a technology that obtains a semantic representation of natural language through syntactic, semantic, and pragmatic analysis. It is an important step in natural language processing.
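One layer of that analysis, syntactic (dependency) parsing, can be sketched with spaCy; this assumes spaCy and its small English model en_core_web_sm are installed, and it illustrates only the grammar-analysis step, not full semantic or pragmatic understanding.

import spacy

nlp = spacy.load("en_core_web_sm")          # pretrained English pipeline (assumed installed)
doc = nlp("The cat chased the mouse.")

# Dependency parse: one building block on the way to a semantic representation.
for token in doc:
    print(token.text, token.pos_, token.dep_, token.head.text)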
Natural language generation (NLG) is a technology that studies how to enable computers to express themselves and write like humans; that is, to automatically generate high-quality natural language text from key information and its internal machine representation through a planning process.
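As a deliberately simple sketch of the "key information to planned text" idea, the snippet below fills a fixed template from structured fields; real NLG systems (rule-based or neural) are far more sophisticated, and the field names here are hypothetical.

def generate_weather_report(facts):
    """Toy NLG: turn structured key information into a natural language sentence."""
    # Content planning is trivial here: select the fields, then realize them in a template.
    return (f"In {facts['city']}, expect {facts['condition']} today, "
            f"with a high of {facts['high_c']} degrees Celsius.")

facts = {"city": "Beijing", "condition": "light rain", "high_c": 22}   # hypothetical input
print(generate_weather_report(facts))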
Nash equilibrium, also known as non-cooperative game equilibrium, is a key solution concept in game theory: a combination of strategies in which no player can improve their own payoff by unilaterally changing their strategy. It is named after economist John Nash.
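A small worked example: the snippet below brute-forces the pure-strategy Nash equilibria of the prisoner's dilemma (payoffs are the standard textbook values); a strategy pair is an equilibrium when neither player gains by deviating unilaterally.

from itertools import product

# Payoffs (row player, column player) for Cooperate/Defect, standard textbook values.
payoffs = {
    ("C", "C"): (-1, -1),
    ("C", "D"): (-3,  0),
    ("D", "C"): ( 0, -3),
    ("D", "D"): (-2, -2),
}
strategies = ["C", "D"]

def is_nash(row, col):
    """Neither player can improve their own payoff by switching strategy alone."""
    r_pay, c_pay = payoffs[(row, col)]
    row_ok = all(payoffs[(r, col)][0] <= r_pay for r in strategies)
    col_ok = all(payoffs[(row, c)][1] <= c_pay for c in strategies)
    return row_ok and col_ok

print([cell for cell in product(strategies, strategies) if is_nash(*cell)])  # [('D', 'D')]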
Named entity recognition (NER), also known as "proper name recognition", refers to the process by which a computer recognizes named entities in a text, such as person names, organizations, and locations. It is a basic NLP (natural language processing) task.
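A short sketch with spaCy (again assuming the en_core_web_sm model is installed): the pretrained pipeline tags spans such as person names, organizations, and locations.

import spacy

nlp = spacy.load("en_core_web_sm")   # pretrained English pipeline (assumed installed)
doc = nlp("John Nash taught at Princeton University in New Jersey.")

# Each recognized span comes with an entity label, e.g. PERSON, ORG, GPE.
for ent in doc.ents:
    print(ent.text, ent.label_)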