User privacy is becoming increasingly important in building efficient multiagent energy management systems for multimicrogrids (MMGs). As an emerging privacy-protection method, federated learning (FL) has been used to prevent data breaches in the MMG field. However, as the number of participants grows, the communication burden inherent in FL becomes evident. Moreover, since the neural network layers collectively determine an agent's performance, differences in layer convergence speeds can cause an inconsistency problem: FL may slow the convergence of fast-converging layers, which weakens the agent's overall performance. To address these issues, a communication-efficient FL (CEFL) algorithm is proposed in this study. Considering the cooperative relationship among layers, a layer evaluation (LE) mechanism is developed in CEFL to quantify each layer's contribution through the Shapley value (SV), a profit-distribution approach for coalitions. In this way, only the layers with the highest contributions are uploaded to the server. In addition, instead of average parameter aggregation, a communication-efficient parameter aggregation method is proposed in CEFL to update the parameters of the global model (GM), in which an aggregation model (AM) is developed to receive the uploaded parameters for aggregation. The performance of the proposed CEFL is verified by numerical analysis of MMGs with 3-8 participating MGs. Furthermore, experiments investigate the influence of the hyperparameter in CEFL and demonstrate performance improvements over four state-of-the-art algorithms.
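The Shapley-value-based layer evaluation can be illustrated with a minimal sketch. The layer names, the additive toy valuation function, and the top-layer selection rule below are illustrative assumptions, not the paper's implementation; in practice the valuation would measure, for example, validation performance when only a given coalition of layers is aggregated.

```python
from itertools import combinations
from math import factorial

def shapley_values(layers, v):
    """Exact Shapley value of each layer under valuation function v.

    layers: list of layer identifiers (hypothetical names).
    v: callable mapping a frozenset of layers to a real-valued payoff.
    """
    n = len(layers)
    phi = {}
    for layer in layers:
        others = [l for l in layers if l != layer]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                # Standard Shapley weight for a coalition of size k
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                # Marginal contribution of this layer to the coalition
                total += weight * (v(s | {layer}) - v(s))
        phi[layer] = total
    return phi

# Toy additive valuation: each layer contributes a fixed amount (illustrative only).
contrib = {"conv1": 0.3, "conv2": 0.5, "fc": 0.2}
v = lambda s: sum(contrib[l] for l in s)
phi = shapley_values(list(contrib), v)

# Select the highest-contribution layer for upload to the server.
top = max(phi, key=phi.get)
```

For an additive game like this toy example, each layer's Shapley value equals its own fixed contribution, so `conv2` would be selected; exact computation is exponential in the number of layers, which is tractable here because networks have few layers relative to, say, data points.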