Performance of the entropy as an index of classification accuracy in latent profile analysis: A Monte Carlo simulation study
WANG Meng-Cheng, DENG Qiaowen, BI Xiangyang, YE Haosheng, YANG Wendeng
2017, 49 (11): 1473-1482.
doi: 10.3724/SP.J.1041.2017.01473
Latent profile analysis (LPA) is a latent variable modeling technique that identifies latent (unobserved) subgroups of individuals within a population based on continuous indicators, and it has become a popular statistical method for modeling unobserved population heterogeneity in the social and behavioral sciences. Entropy is a standardized index of model-based classification accuracy, with higher values indicating more precise assignment of individuals to latent profiles. Because substantive research often aims to assign individuals to latent subgroups, entropy is commonly reported as an index of the accuracy of class membership assignment. Unfortunately, very few methodological studies have examined how entropy behaves as sample size, latent class separation, number of indicators, and number of classes vary. The primary purpose of this study was therefore to examine the performance of entropy across these conditions. Using Monte Carlo simulation, we generated artificial data from known true models and evaluated the performance of entropy and entropy-based indexes (CLC, ICL_BIC, and sample-size-adjusted ICL_BIC) under different modeling conditions. The simulation comprised 120 cells crossing sample size (50, 100, 500, 1000, 3000), latent class separation (0.5, 1.2, 3), number of indicators (4, 8, 12, 20), and number of latent classes (3, 5), with 100 replications per cell. The continuous indicators were not allowed to correlate within class, and class separation was defined by the Mahalanobis distance (MD) between class means on the observed variables. Data generation and analysis were carried out with the Monte Carlo facilities of Mplus 7.4. For 3 latent classes, entropy values of about 0.76 and above were associated with at least 90% correct assignment, whereas values of about 0.64 and below were associated with classification error rates of at least 20%. For 5 latent classes, entropy values of about 0.84 and above were associated with at least 90% correct assignment. Entropy decreased, and the classification error rate increased, as sample size increased; entropy performed well with small samples (50-100) and with larger numbers of indicators. Entropy was consistently better when latent class separation was large (MD = 3), and this result held across sample sizes and numbers of latent classes. CLC, ICL_BIC, and sample-size-adjusted ICL_BIC showed similar trends: they increased with sample size and with larger class separation, but the differences in entropy produced by class separation were more noticeable. In sum, the simulation indicates that entropy is strongly related to correct class membership assignment, but its behavior varies with the number of latent classes, sample size, latent class separation, and number of indicators. It is therefore difficult to establish universal cutoff values for entropy as an indicator of classification accuracy.
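For reference, the entropy index discussed above is, in the standard finite-mixture formulation (the relative entropy reported by programs such as Mplus), computed from the estimated posterior class-membership probabilities:

E_K = 1 - [ \sum_{i=1}^{n} \sum_{k=1}^{K} ( -\hat{p}_{ik} \ln \hat{p}_{ik} ) ] / ( n \ln K )

where \hat{p}_{ik} is the estimated posterior probability that individual i belongs to class k, n is the sample size, and K is the number of latent classes; E_K approaches 1 when every individual is assigned to one class with near certainty. A minimal sketch of this computation, assuming an n x K matrix of posterior probabilities (illustrative Python, not the authors' code):

import numpy as np

def relative_entropy(posterior):
    """Relative entropy E_K from an n x K matrix of posterior
    class-membership probabilities (rows sum to 1)."""
    n, k = posterior.shape
    p = np.clip(posterior, 1e-12, 1.0)       # guard against log(0)
    uncertainty = -np.sum(p * np.log(p))     # total classification uncertainty
    return 1.0 - uncertainty / (n * np.log(k))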