PodcastIntel
Neural intel Pod

Continual Forgetting for Pre-trained Vision Models

Apr 11, 2025 · 00:13:20
AI Summary
  • The research addresses continual forgetting: selectively erasing specific unwanted knowledge from pre-trained vision models while preserving the rest.
  • GS-LoRA++ removes the targeted knowledge by applying group sparsity over LoRA adapter groups.
  • The method is parameter-efficient and retains performance on the remaining tasks.
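The group-sparsity idea behind the summary can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes each "group" is one layer's hypothetical LoRA pair (A, B), applies the group-lasso regularizer (the sum of per-group L2 norms) that drives whole groups toward zero, and prunes any group whose norm falls below a threshold, which corresponds to discarding the adapters tied to forgotten knowledge.

```python
import numpy as np

def group_lasso_penalty(groups):
    """Group-lasso regularizer: sum of L2 norms, one norm per parameter group."""
    return sum(
        np.linalg.norm(np.concatenate([p.ravel() for p in g]))
        for g in groups
    )

def prune_groups(groups, threshold=1e-3):
    """Zero out whole groups whose norm is below threshold (forgotten adapters)."""
    pruned = []
    for g in groups:
        norm = np.linalg.norm(np.concatenate([p.ravel() for p in g]))
        if norm < threshold:
            g = [np.zeros_like(p) for p in g]  # drop the entire group at once
        pruned.append(g)
    return pruned

rng = np.random.default_rng(0)
groups = [
    # near-zero group: the regularizer has shrunk it, so pruning removes it
    [rng.normal(size=(4, 2)) * 1e-5, rng.normal(size=(2, 4)) * 1e-5],
    # active group: clearly above threshold, so it is kept intact
    [rng.normal(size=(4, 2)), rng.normal(size=(2, 4))],
]
pruned = prune_groups(groups)
print(sum(all((p == 0).all() for p in g) for g in pruned))  # prints 1 (one group pruned)
```

Penalizing the norm of each group as a whole, rather than individual weights, is what lets entire adapters be removed cleanly instead of leaving scattered nonzero entries.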
