Invention Title:

METHOD AND DEVICE WITH BAYESIAN META CONTINUAL-LEARNING AND INFERRING

Publication number:

US20260037831

Section:

Physics

Class:

G06N3/0985

Smart overview of the Invention

The patent application introduces a method and device for meta continual learning and inference based on Bayesian principles. It targets learning from non-stationary data streams, in which the statistical properties of the data evolve over time. The approach relies on a sequential Bayesian update mechanism to mitigate catastrophic forgetting, a common problem in continual learning where training on new data overwrites old knowledge and degrades performance on previously learned data.

Core Methodology

The method applies Bayes' theorem to compute the likelihood of the learning data given a latent variable, using a data distribution learner. A Bayes' calculator then performs a sequential Bayesian update to obtain the final posterior distribution of the latent variable. The latent variable is sampled from this posterior and used to infer test output data. The whole process is supported by meta-learning, which trains the neural networks of the data distribution learner, the Bayes' calculator, and the inference engine.
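As an illustration of the sequential update described above, here is a minimal sketch assuming a scalar latent variable with a conjugate Gaussian prior and a Gaussian likelihood (the function and parameter names are hypothetical, not from the patent, and the real components are neural networks rather than closed-form updates):

```python
def sequential_gaussian_update(stream, prior_mu=0.0, prior_var=1.0, noise_var=0.5):
    """Sequentially update a Gaussian posterior over a scalar latent variable.

    Each observation x in the stream is assumed drawn from N(latent, noise_var).
    Because the prior is conjugate, the posterior stays Gaussian, so the final
    posterior can be computed one observation at a time via Bayes' theorem.
    """
    mu, var = prior_mu, prior_var
    for x in stream:
        # Posterior precision is the prior precision plus the likelihood precision.
        precision = 1.0 / var + 1.0 / noise_var
        mu = (mu / var + x / noise_var) / precision
        var = 1.0 / precision
    return mu, var
```

A useful property of this conjugate sketch is that the sequential updates yield exactly the same final posterior as processing the whole batch at once, which is what lets a non-stationary stream be consumed incrementally.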

Implementation Details

The Bayesian update process expresses the prior and posterior distributions in exponential family form, often as Gaussian distributions, so each update reduces to combining natural parameters. Sampling from the final posterior distribution is achieved with Monte Carlo approximation or the reparameterization trick. The inference engine, which may be generative, uses the sampled latent variable to infer a data distribution from test input data.
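The two ingredients above can be sketched for the scalar Gaussian case (helper names are hypothetical): in natural-parameter form the Bayesian update is just an addition, and the reparameterization trick writes a posterior sample as a deterministic function of the parameters and standard normal noise, which keeps the sampling step differentiable for meta-training:

```python
import math

def to_natural(mu, var):
    # Gaussian natural parameters: eta1 = mu / var, eta2 = -1 / (2 * var).
    return mu / var, -0.5 / var

def from_natural(eta1, eta2):
    var = -0.5 / eta2
    return eta1 * var, var

def posterior_update(prior_mu, prior_var, x, noise_var):
    """One Bayesian step in exponential family (natural-parameter) form:
    the posterior's natural parameters are the sum of the prior's and the
    Gaussian likelihood term for the single observation x."""
    p1, p2 = to_natural(prior_mu, prior_var)
    l1, l2 = to_natural(x, noise_var)
    return from_natural(p1 + l1, p2 + l2)

def reparameterized_sample(mu, var, eps):
    # Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, 1),
    # so gradients can flow through mu and var.
    return mu + math.sqrt(var) * eps
```

With a standard-normal prior and one observation x = 1.0 at noise variance 0.5, this yields the same posterior (mean 2/3, variance 1/3) as the conjugate precision-weighting formula.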

Device Configuration

The described device includes a data distribution learner, a Bayes' calculator, and an inference engine, all trained through meta-learning, and may also incorporate a Monte Carlo approximator for sampling. Its architecture consists of layers that perform sequential Bayesian update-based meta inference and meta learning on neural network parameters.
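The Monte Carlo approximator's role can be sketched as averaging the inference engine's output over latent samples drawn from the final posterior (names and the scalar setup are hypothetical; in the device the inference engine is a trained neural network):

```python
import math
import random

def monte_carlo_predict(posterior_mu, posterior_var, infer, num_samples=1000, seed=0):
    """Monte Carlo approximation of the predictive output: draw latent
    samples z from the final posterior N(posterior_mu, posterior_var) and
    average the inference engine's output infer(z) over them."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        z = posterior_mu + math.sqrt(posterior_var) * rng.gauss(0.0, 1.0)
        total += infer(z)
    return total / num_samples
```

As the number of samples grows, the average converges to the posterior expectation of the inference engine's output, trading computation for accuracy.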

Benefits and Performance

The proposed method and device aim to improve mean accuracy in meta continual learning while reducing computation and memory cost. By mitigating catastrophic forgetting, the approach maintains robust performance on both new and previously encountered data.