Semantic communication systems driven by artificial intelligence often suffer significant performance degradation in dynamic wireless channel environments. Existing approaches typically address this issue by attaching neural network modules to the joint source-channel codec to process channel state information (CSI). While these methods improve performance under varying channel conditions, the added modules introduce extra model parameters and computational overhead, increasing encoding-decoding latency. To address this challenge, we investigate the inherent properties of the recent Mamba model and derive its closed-form response to initial states. Our analysis reveals a characteristic forgetting behavior: the contribution of initial-state information decays over time. Based on this finding, we propose an endogenous channel adaptation method. By encoding CSI into the model's initial state and reinjecting it into the state space once it has been forgotten, the proposed method enables the model to encode adaptively without additional parameter or computational overhead, significantly improving system performance across diverse channel conditions.
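The forgetting behavior and reinjection idea described above can be sketched with a toy diagonal linear state-space recurrence. This is a hypothetical illustration, not the authors' model: the recurrence `h_t = a * h_t-1 + b * x_t` with `|a| < 1` makes the initial state's contribution decay as `a**t`, and `reinject_every` is an assumed parameter standing in for the paper's CSI-reinjection mechanism.

```python
import numpy as np

def run_ssm(x, h0, a=0.9, b=1.0, reinject_every=None):
    """Scan a scalar linear SSM over inputs x, optionally re-adding the
    initial state h0 (carrying CSI) every `reinject_every` steps."""
    h = h0
    states = []
    for t, xt in enumerate(x, start=1):
        h = a * h + b * xt
        if reinject_every and t % reinject_every == 0:
            h = h + h0  # reinject CSI once its contribution has decayed
        states.append(h)
    return np.array(states)

x = np.zeros(50)   # zero input isolates the response to the initial state
csi = 1.0          # CSI encoded in the initial state
no_reinject = run_ssm(x, csi)
with_reinject = run_ssm(x, csi, reinject_every=10)

# Without reinjection the initial-state contribution decays as a**t
# (the forgetting behavior); periodic reinjection keeps it alive.
print(no_reinject[-1], with_reinject[-1])
```

With zero input, the state without reinjection is exactly `0.9**t`, which has nearly vanished by step 50, while periodic reinjection repeatedly restores the CSI contribution.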