Wiki
Machine Learning Glossary: Explore definitions and explanations of key AI and ML concepts
The learning rule is a concept in neural network models that represents how the weights in the network are adjusted over time. This is generally viewed as a long-term dynamical rule.
The actor-critic algorithm, also called the AC algorithm, is a reinforcement learning algorithm that combines a policy network (the actor) with a value function (the critic). It uses the reward and punishment signals produced by the agent's actions to learn the probability of taking each action in each state.
The task of the acoustic model is to calculate P(O|W), the probability of generating the observed speech waveform O given the word sequence W. The acoustic model is one of the most important parts of a speech recognition system: it accounts for most of the computational overhead and largely determines the system's performance.
The adaptive bitrate algorithm is a video transmission technology that automatically adjusts the streaming media bitrate. The adjustment factors mainly depend on the network conditions or client delay.
The Tensor Processing Unit (TPU) is an application-specific integrated circuit (ASIC) developed by Google specifically for machine learning workloads.
An oblique decision tree, also called a multivariate decision tree, is a decision tree in which nodes split on linear combinations of multiple attributes rather than on a single attribute.
Unordered attributes are attributes whose values cannot be meaningfully arranged in order.
The restricted isometry property (RIP) characterizes matrices that behave nearly like an orthonormal system when applied to sparse vectors; it arises in problems such as sparse recovery and compressed sensing.
Training examples are the labeled instances used during the training process.
The support vector expansion expresses the model's optimal solution as a linear combination of kernel functions evaluated at the training samples.
Sparsity refers to a situation where a large proportion of elements are zero.
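As a small sketch (the function name and example matrix are illustrative), sparsity can be measured as the fraction of zero elements:

```python
import numpy as np

# Measure sparsity as the fraction of elements that are exactly zero.
def sparsity(matrix: np.ndarray) -> float:
    return float(np.count_nonzero(matrix == 0) / matrix.size)

m = np.array([[0, 0, 3], [0, 5, 0]])
print(sparsity(m))  # 4 of the 6 elements are zero
```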
The state feature function is a feature function defined on the nodes of the model; it depends only on the current position.
The true positive rate (TPR) is the ratio of the number of correctly predicted positive samples to the actual number of positive samples.
True positives (TP) refer to those samples that are correctly judged as positive in a binary classification problem.
True negatives (TN) refer to those samples that are correctly judged as negative in a binary classification problem.
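These three quantities can be illustrated with a short sketch (the labels and predictions below are made up for the example):

```python
# Binary ground-truth labels and a classifier's predictions.
y_true = [1, 1, 1, 0, 0, 1]
y_pred = [1, 0, 1, 0, 1, 1]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
actual_pos = sum(y_true)                                          # actual positives

tpr = tp / actual_pos  # true positive rate (also known as recall)
print(tp, tn, tpr)     # 3 1 0.75
```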
Transductive learning is a method of predicting specific test samples by observing specific training samples.
Threshold shifting refers to adjusting the threshold for classifying categories according to actual conditions. It is often used to solve the problem of category imbalance.
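A minimal sketch of threshold shifting (scores and thresholds here are illustrative): lowering the decision threshold below the default 0.5 makes the classifier flag more samples as the rare positive class.

```python
# Predicted probabilities of the positive class.
scores = [0.9, 0.45, 0.3, 0.2, 0.6]

default = [1 if s >= 0.5 else 0 for s in scores]   # default threshold
shifted = [1 if s >= 0.3 else 0 for s in scores]   # lowered threshold -> more positives

print(default)  # [1, 0, 0, 0, 1]
print(shifted)  # [1, 1, 1, 0, 1]
```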
The threshold logic unit (TLU) is a basic building block of neural networks.
A threshold, also called a critical value, is the value a condition must reach for an object to undergo a particular change. It is a common term in academic research.
The least squares method is a mathematical optimization method that finds the function best fitting the data by minimizing the sum of squared errors.
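A minimal sketch of least squares, fitting a line y = a*x + b (the data points are made up and chosen to lie exactly on y = 2x + 1):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])  # exactly y = 2x + 1

# Design matrix with columns [x, 1]; lstsq minimizes ||A @ coeffs - y||^2.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(a, b)  # approximately 2.0 and 1.0
```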
A tensor is a multilinear function that can be used to represent linear relationships between vectors, scalars, and other tensors.
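For instance, a rank-2 tensor (a matrix) M can be viewed as the bilinear map (u, v) → uᵀMv, a multilinear function of two vectors (the values below are illustrative):

```python
import numpy as np

M = np.array([[1.0, 2.0], [3.0, 4.0]])
u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

# Feeding in the two basis vectors picks out the entry M[0, 1].
print(u @ M @ v)  # 2.0
```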
The Wasserstein Generative Adversarial Network (WGAN) has several advantages: it stabilizes GAN training, removing the need to carefully balance the training of the generator and the discriminator; it largely solves the mode collapse problem, ensuring diversity in the generated samples; and it provides a numerical quantity, analogous to cross-entropy or accuracy, that indicates training progress.
The Viterbi algorithm is a dynamic programming algorithm for finding the most likely sequence of hidden states, for example in a hidden Markov model.
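A compact sketch of Viterbi decoding for a two-state hidden Markov model (the toy probabilities below are invented for illustration):

```python
import numpy as np

def viterbi(obs, start_p, trans_p, emit_p):
    """Return the most probable hidden-state path for an observation sequence."""
    # Log-probability of each state after the first observation.
    logp = np.log(start_p) + np.log(emit_p[:, obs[0]])
    back = []  # backpointers for path reconstruction
    for o in obs[1:]:
        # scores[i, j]: best log-prob of reaching state j from state i.
        scores = logp[:, None] + np.log(trans_p) + np.log(emit_p[:, o])[None, :]
        back.append(scores.argmax(axis=0))
        logp = scores.max(axis=0)
    # Trace the best path backwards through the backpointers.
    path = [int(logp.argmax())]
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]

# Two hidden states, two observation symbols (toy parameters).
start = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3], [0.4, 0.6]])
emit = np.array([[0.9, 0.1], [0.2, 0.8]])
print(viterbi([0, 0, 1], start, trans, emit))  # [0, 0, 1]
```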
The VC (Vapnik-Chervonenkis) dimension measures the capacity of a class of binary classifiers.