Chinese spelling correction has achieved significant progress, but critical challenges remain, especially in handling visually and phonetically similar errors within complex syntactic structures. This paper introduces a novel approach that combines a Long Short-Term Memory (LSTM)-enhanced Transformer for error detection with Bidirectional Encoder Representations from Transformers (BERT)-based correction under a dynamic adaptive weighting scheme. The Transformer uses a global attention mechanism to capture dependencies between any two positions in the input sequence, while the LSTM, by processing tokens recurrently, captures local context and sequential information more finely. Based on an adaptive weighting coefficient, the multi-task learning weights are adjusted automatically, helping the model balance the learning of the detection and correction networks so that it converges faster and achieves higher precision. Comprehensive evaluations demonstrate improved performance over existing baselines, particularly on complex error patterns.
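As a rough illustration of the adaptive weighting idea, the sketch below combines detection and correction losses with a weight derived from their relative magnitudes. The weighting rule and the function name `adaptive_multitask_loss` are illustrative assumptions for exposition, not the paper's exact coefficient definition.

```python
import torch

def adaptive_multitask_loss(det_loss: torch.Tensor, cor_loss: torch.Tensor) -> torch.Tensor:
    """Combine detection and correction losses with a dynamically computed weight.

    Hypothetical scheme: the weight is the detection loss's share of the total,
    so the currently harder task receives more gradient signal.
    """
    with torch.no_grad():
        # Detach the weight so it scales the losses without being optimized itself.
        w = det_loss / (det_loss + cor_loss + 1e-8)
    return w * det_loss + (1.0 - w) * cor_loss

# Example usage with placeholder loss values.
det_loss = torch.tensor(0.9)   # detection network loss
cor_loss = torch.tensor(0.3)   # correction network loss
print(adaptive_multitask_loss(det_loss, cor_loss))
```

In this sketch the coefficient is recomputed every step from the current loss values, so neither task's objective is fixed in advance; the paper's scheme may instead learn or schedule the coefficient differently.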