FedEDS is a proposed federated learning scheme that improves model training on edge devices with heterogeneous data while preserving privacy through encrypted data sharing. By letting clients exchange encrypted data, the approach mitigates the impact of network topology and inter-client data variability, accelerating convergence and improving model performance in decentralized settings. Experimental results demonstrate that FedEDS is effective at speeding up and stabilizing the training process.
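
The summary does not spell out FedEDS's actual protocol, so the sketch below is only a rough illustration of how encrypted data sharing could plug into a FedAvg-style round: each client encrypts a small shard of its local data (here with Fernet symmetric encryption from the `cryptography` package), passes it to a ring neighbor, trains on the augmented set, and the results are averaged. The function names (`fed_eds_round`, `encrypt_shard`, `local_step`), the ring topology, and the shard fraction are all illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of an encrypted-data-sharing round; NOT the paper's exact protocol.
import pickle

import numpy as np
from cryptography.fernet import Fernet


def local_step(w, X, y, lr=0.1):
    """One gradient step of logistic regression on a client's (augmented) data."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return w - lr * X.T @ (p - y) / len(y)


def encrypt_shard(X, y, frac=0.1):
    """Encrypt a small random shard of local data for sharing with a peer."""
    idx = np.random.choice(len(y), max(1, int(frac * len(y))), replace=False)
    key = Fernet.generate_key()
    token = Fernet(key).encrypt(pickle.dumps((X[idx], y[idx])))
    # In a real system the key would travel over a separate secure channel
    # (e.g., encrypted with the receiving peer's public key).
    return token, key


def fed_eds_round(w_global, clients):
    """One round: neighbors exchange encrypted shards, train locally, then average."""
    shards = [encrypt_shard(X, y) for X, y in clients]
    new_weights = []
    for i, (X, y) in enumerate(clients):
        token, key = shards[(i + 1) % len(clients)]  # shard from ring neighbor
        Xs, ys = pickle.loads(Fernet(key).decrypt(token))
        X_aug, y_aug = np.vstack([X, Xs]), np.concatenate([y, ys])
        new_weights.append(local_step(w_global.copy(), X_aug, y_aug))
    return np.mean(new_weights, axis=0)  # FedAvg-style aggregation


# Toy usage: three clients with skewed (heterogeneous) label distributions.
rng = np.random.default_rng(0)
clients = [
    (rng.normal(c, 1, (50, 4)), np.full(50, c % 2, dtype=float)) for c in range(3)
]
w = np.zeros(4)
for _ in range(20):
    w = fed_eds_round(w, clients)
print("trained weights:", w)
```

The shared shard gives each client a glimpse of a neighbor's data distribution, which is the intuition behind using data sharing to counter heterogeneity; how FedEDS actually encrypts, routes, and consumes shared data would need to be taken from the paper itself.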