
DP

📄️ compress model

The number of Embedding nets in the DP model grows as $N^2$, where $N$ is the number of atom types; for a quinary alloy ($N = 5$) the model already contains 25 of them. As the number of atom types increases, the number of Embedding nets grows rapidly, enlarging the computational graph for backpropagation and becoming one of the bottlenecks for DP model inference. Timing statistics for DP inference on a quinary alloy system show that Embedding net evaluation and its gradient calculation account for over 90% of the total time, indicating significant room for optimization.

📄️ type embedding

Since the number of Embedding Nets in the DP model is $N^2$, where $N$ is the number of element types, both training and inference speed are limited when the system contains many element types. This also restricts the DP model's scalability to larger models. Considering that the $N^2$ Embedding Nets implicitly encode the element-type information, we can achieve a similar effect by adjusting the network input: concatenating the physical property information of the element types with $S_{ij}$. This way, only a single Embedding Net is needed instead of $N^2$.
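A minimal sketch of this idea, assuming a PyTorch-style implementation; the names (`SharedEmbeddingNet`, `type_dim`, etc.) are illustrative and not part of the DP code base. The point is that a single network can serve all element-type pairs because the type information enters through its input rather than through $N^2$ separate networks.

```python
# Sketch only: a single embedding net whose input is s_ij concatenated with a
# learned per-type vector, instead of N^2 nets indexed by (type_i, type_j).
import torch
import torch.nn as nn

class SharedEmbeddingNet(nn.Module):
    def __init__(self, num_types: int, type_dim: int = 8, hidden: int = 64, out_dim: int = 32):
        super().__init__()
        # Learned vector per element type (a stand-in for "physical property" information)
        self.type_embed = nn.Embedding(num_types, type_dim)
        self.net = nn.Sequential(
            nn.Linear(1 + type_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, s_ij: torch.Tensor, type_j: torch.Tensor) -> torch.Tensor:
        # s_ij: (n_pairs, 1) smoothed pair inputs; type_j: (n_pairs,) neighbor element indices
        t = self.type_embed(type_j)                     # (n_pairs, type_dim)
        return self.net(torch.cat([s_ij, t], dim=-1))   # (n_pairs, out_dim)

# Usage: one net handles every neighbor, whatever its element type.
net = SharedEmbeddingNet(num_types=5)
s_ij = torch.rand(10, 1)
type_j = torch.randint(0, 5, (10,))
g_ij = net(s_ij, type_j)  # (10, 32)
```

Because the type index is folded into the input, adding a new element type changes only the embedding table, not the number of networks, which is what removes the $N^2$ scaling.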