Wiki
We have compiled hundreds of related entries to help you understand artificial intelligence.
Hardware acceleration refers to offloading highly computation-intensive tasks to specialized hardware in a computer, which reduces the workload of the central processing unit and is more efficient than running the same work in software on a general-purpose CPU.
Parallel computing is a sub-field of high performance computing (HPC). Relative to serial computing, it is a computing mode that improves computing efficiency by executing multiple tasks on multiple processors or computers simultaneously.
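As an illustrative sketch (not a definitive implementation), the idea of decomposing work into independent tasks and executing them concurrently can be shown with Python's standard `concurrent.futures` module; note that for CPU-bound Python code, true parallelism would require processes rather than threads because of the global interpreter lock.

```python
from concurrent.futures import ThreadPoolExecutor

def task(n):
    # One independent unit of work; a real workload would be CPU- or I/O-heavy.
    return sum(i * i for i in range(n))

inputs = [1000, 2000, 3000, 4000]

# Submit the independent tasks to a pool so they run concurrently
# instead of one after another.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(task, inputs))

print(results[0])  # → 332833500
```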
High-throughput computing (HTC) is a form of computing that uses available resources to run very large numbers of loosely coupled computing tasks in parallel.
The term High Performance Computing (HPC) emerged after the term "supercomputing" and is a field of computing that uses powerful computing resources to solve complex problems.
A large language model (LLM) is an artificial intelligence model that uses a neural network with a very large number of parameters, trained with self-supervised learning techniques, to process and understand human language or text.
Output representation perturbation is a method of transforming the output representation and introducing disturbances, often used to increase the diversity of ensemble learners; a typical form converts classification outputs into regression outputs and then builds individual learners on the transformed targets.
Random forest is a versatile ensemble algorithm that combines many decision trees, each trained on a bootstrap sample of the data, and aggregates their predictions by voting or averaging.
Random walk is a statistical model consisting of a series of random action trajectories, which is used to represent irregular changes.
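A minimal sketch of a one-dimensional random walk: at each step the position moves up or down by one with equal probability, producing an irregular trajectory. The function name and seeding are illustrative choices, not a standard API.

```python
import random

def random_walk(steps, seed=0):
    """Simulate a 1-D random walk: each step moves +1 or -1 with equal probability."""
    rng = random.Random(seed)  # seeded for reproducibility
    position = 0
    trajectory = [position]
    for _ in range(steps):
        position += rng.choice([-1, 1])
        trajectory.append(position)
    return trajectory

walk = random_walk(100)
```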
Neural Machine Translation (NMT) is a machine translation framework based on pure neural networks. It uses neural networks to achieve end-to-end translation from source language to target language.
The Neural Turing Machine is a neural-network-based architecture inspired by the Turing machine. It is fully differentiable, so it can be trained end to end with gradient descent, and it consists of a neural network controller coupled to an external memory.
On-policy learning means that the policy used to generate samples is the same as the policy being updated when the network's parameters change. A typical on-policy method is the SARSA algorithm.
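A minimal sketch of the SARSA update, assuming a tabular Q-function stored in a dictionary (the function names and the epsilon-greedy helper are illustrative, not a library API). The on-policy character shows up in the bootstrap target, which uses the action actually selected by the behaviour policy rather than the greedy maximum.

```python
import random
from collections import defaultdict

def epsilon_greedy(Q, state, actions, epsilon, rng):
    # The behaviour policy: mostly greedy, occasionally random.
    if rng.random() < epsilon:
        return rng.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def sarsa_update(Q, s, a, reward, s_next, a_next, alpha=0.1, gamma=0.9):
    # On-policy: the target uses the action a_next actually chosen by the
    # same epsilon-greedy policy (Q-learning would use the greedy max instead).
    target = reward + gamma * Q[(s_next, a_next)]
    Q[(s, a)] += alpha * (target - Q[(s, a)])

Q = defaultdict(float)
sarsa_update(Q, "s0", "a0", reward=1.0, s_next="s1", a_next="a1")
```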
Receiver operating characteristic (ROC) analysis evaluates a matching or classification algorithm. As the decision threshold on the matching score varies, it traces the relationship between the false acceptance (false positive) rate and the false rejection rate, reflecting the balance the algorithm achieves between the two at different thresholds.
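A minimal sketch of computing ROC points by sweeping a threshold over match scores; the function name and the toy scores/labels are illustrative assumptions, not a library API.

```python
def roc_points(scores, labels, thresholds):
    """For each threshold, return (false positive rate, true positive rate)."""
    positives = sum(labels)
    negatives = len(labels) - positives
    points = []
    for t in thresholds:
        # A sample is accepted when its score meets the threshold.
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / negatives, tp / positives))
    return points

# Toy data: two genuine matches with high scores, two impostors with low scores.
points = roc_points([0.9, 0.8, 0.4, 0.2], [1, 1, 0, 0], [0.5])
# At threshold 0.5 both positives are accepted and no negative is: (0.0, 1.0)
```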
A restricted Boltzmann machine is a stochastic neural network model with a two-layer structure, symmetric connections between the layers, and no connections within a layer (no self-feedback).
Simultaneous Localization and Mapping (SLAM) is a technique in robotics by which a robot incrementally builds a map of an unknown environment while simultaneously estimating its own position within that map.
Statistical learning is a discipline that builds probabilistic statistical models based on data to predict and analyze data, also known as statistical machine learning.
A surrogate loss function is a function used in place of the original loss function when the original is inconvenient to optimize, for example because it is non-convex or non-differentiable.
Upsampling, or image interpolation, is mainly used to enlarge the original image so that it can be displayed on a higher resolution display device.
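The simplest interpolation scheme is nearest-neighbour upsampling, in which each source pixel is replicated into a block. A minimal sketch on a 2-D list standing in for an image (the function name is an illustrative assumption):

```python
def upsample_nearest(image, factor):
    """Enlarge a 2-D grid by an integer factor with nearest-neighbour interpolation:
    each source pixel becomes a factor x factor block of identical pixels."""
    out = []
    for row in image:
        # Repeat each pixel horizontally...
        expanded = [pix for pix in row for _ in range(factor)]
        # ...then repeat the whole row vertically.
        out.extend([expanded[:] for _ in range(factor)])
    return out

small = [[1, 2],
         [3, 4]]
big = upsample_nearest(small, 2)
# big is [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```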
The vanishing gradient problem is a problem encountered when training artificial neural networks using gradient descent and backpropagation: because backpropagation multiplies one derivative factor per layer, gradients can shrink geometrically with depth, so the early layers learn very slowly.
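A minimal numerical sketch of the mechanism: the derivative of the sigmoid activation never exceeds 0.25, so even in the best case a gradient passing backwards through 20 sigmoid layers is scaled by at most 0.25 per layer and all but disappears.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # at most 0.25, attained at x = 0

# Backprop multiplies one derivative factor per layer; even at the maximum
# (0.25 per layer) the product shrinks geometrically with depth.
grad = 1.0
for _ in range(20):
    grad *= sigmoid_derivative(0.0)

print(grad)  # 0.25 ** 20, about 9.1e-13
```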
T-Distributed Stochastic Neighbor Embedding (t-SNE) is a machine learning method for dimensionality reduction.
Treebank is a deep-processed corpus that performs word segmentation, part-of-speech tagging, and syntactic structure relationship tagging on sentences.
The Turing machine, also known as the deterministic Turing machine, is an abstract computing model proposed by Alan Turing in 1936. More abstractly, it is a mathematical logic machine that can be regarded as equivalent in power to any finite logical or mathematical procedure.
Specialization is a process of moving from the general to the specific.
A synonym set is a collection of words with the same meaning.
The time step defines the length of the interval between successive physics simulation updates. In a game engine, it determines how often the physics update function runs.
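A minimal sketch of a fixed-timestep update loop, assuming a trivial "physics" state (a body moving at constant velocity); the function name and tolerance constant are illustrative choices, not an engine API.

```python
def run_fixed_timestep(total_time, dt):
    """Advance a trivial physics state with a fixed time step dt.
    Returns the number of simulation steps taken and the final position
    of a body moving at constant velocity 1.0."""
    position = 0.0
    velocity = 1.0
    steps = 0
    t = 0.0
    # Small tolerance guards against floating-point accumulation error.
    while t + dt <= total_time + 1e-9:
        position += velocity * dt   # one physics update per fixed step
        t += dt
        steps += 1
    return steps, position

# Simulate one second at 60 updates per second.
steps, position = run_fixed_timestep(1.0, 1.0 / 60.0)
```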