Optoelectronics Letters, Vol. 21, Issue 3, 149 (2025)
Wenqiang LI, Min WU, Weijun LI, Meilan HAO, and Lina YU
DOI: 10.1007/s11801-025-4064-2
LI Wenqiang, WU Min, LI Weijun, HAO Meilan, YU Lina. Unveiling the relationship between Fabry-Perot laser structures and optical field distribution via symbolic regression[J]. Optoelectronics Letters, 2025, 21(3): 149.
References

[1] GOSTIMIROVIC D, YE W N. Automating photonic design with machine learning[C]//2018 IEEE 15th International Conference on Group IV Photonics (GFP), August 29-31, 2018, Cancun, Mexico. New York: IEEE, 2018: 1-2.

[2] BARTH C, BECKER C. Machine learning classification for field distributions of photonic modes[J]. Communications physics, 2018, 1(1): 58.

[3] KARANOV B, CHAGNON M, THOUIN F, et al. End-to-end deep learning of optical fiber communications[J]. Journal of lightwave technology, 2018, 36(20): 4843-4855.

[4] TAHERSIMA M H, KOJIMA K, KOIKE-AKINO T, et al. Deep neural network inverse design of integrated photonic power splitters[J]. Scientific reports, 2019, 9(1): 1368.

[5] BAXTER J, CAL LESINA A, GUAY J M, et al. Plasmonic colours predicted by deep learning[J]. Scientific reports, 2019, 9(1): 8074.

[6] CHEN C L, MAHJOUBFAR A, TAI L C, et al. Deep learning in label-free cell classification[J]. Scientific reports, 2016, 6(1): 21471.

[7] BORHANI N, KAKKAVA E, MOSER C, et al. Learning to see through multimode fibers[J]. Optica, 2018, 5(8): 960-966.

[8] FANG Y, HAN H B, BO W B, et al. Deep neural network for modeling soliton dynamics in the mode-locked laser[J]. Optics letters, 2023, 48: 779-782.

[9] FORREST S. Genetic algorithms: principles of natural selection applied to computation[J]. Science, 1993, 261(5123): 872-878.

[10] KOZA J R. Genetic programming as a means for programming computers by natural selection[J]. Statistics and computing, 1994, 4: 87-112.

[11] PETERSEN B K, LARMA M L, MUNDHENK T N, et al. Deep symbolic regression: recovering mathematical expressions from data via risk-seeking policy gradients[EB/OL]. (2019-12-10) [2024-01-23]. https://arxiv.org/abs/1912.04871.

[12] LI W, LI W, YU L, et al. A neural-guided dynamic symbolic network for exploring mathematical expressions from data[EB/OL]. (2023-09-24) [2024-01-23]. https://arxiv.org/abs/2309.13705.

[13] LIPTON Z C, BERKOWITZ J, ELKAN C. A critical review of recurrent neural networks for sequence learning[EB/OL]. (2015-05-29) [2024-01-23]. https://arxiv.org/abs/1506.00019.

[14] MUNDHENK T N, LANDAJUELA M, GLATT R, et al. Symbolic regression via neural-guided genetic programming population seeding[EB/OL]. (2021-10-29) [2024-01-23]. https://arxiv.org/abs/2111.00053.

[15] LI W, LI W, SUN L, et al. Transformer-based model for symbolic regression via joint supervised learning[C]//International Conference on Learning Representations, 2022.

[16] LIU J, LI W, YU L, et al. SNR: symbolic network-based rectifiable learning framework for symbolic regression[J]. Neural networks, 2023, 165: 1021-1034.

[17] WU M, LI W, YU L, et al. Discovering mathematical expressions through DeepSymNet: a classification-based symbolic regression framework[J]. IEEE transactions on neural networks and learning systems, 2023.

[18] OBADA D O, OKAFOR E, ABOLADE S A, et al. Explainable machine learning for predicting the band gaps of ABX3 perovskites[J]. Materials science in semiconductor processing, 2023, 161: 107427.

[19] MANTI S, SVENDSEN M K, KNØSGAARD N R, et al. Exploring and machine learning structural instabilities in 2D materials[J]. NPJ computational materials, 2023, 9(1): 33.

[20] WILLHELM D, WILSON N, ARROYAVE R, et al. Predicting van der Waals heterostructures by a combined machine learning and density functional theory approach[J]. ACS applied materials & interfaces, 2022, 14(22): 25907-25919.

[21] HUANG P, LUKIN R, FALEEV M, et al. Unveiling the complex structure-property correlation of defects in 2D materials based on high throughput datasets[J]. NPJ 2D materials and applications, 2023, 7(1): 6.

[22] BREIMAN L. Random forests[J]. Machine learning, 2001, 45: 5-32.

[23] DRUCKER H, BURGES C J, KAUFMAN L, et al. Support vector regression machines[J]. Advances in neural information processing systems, 1996, 9.

[24] QUINLAN J R. C4.5: programs for machine learning[M]. Amsterdam: Elsevier, 2014.

[25] RUMELHART D E, HINTON G E, WILLIAMS R J. Learning representations by back-propagating errors[J]. Nature, 1986, 323(6088): 533-536.

[26] PEDREGOSA F, VAROQUAUX G, GRAMFORT A, et al. Scikit-learn: machine learning in Python[J]. Journal of machine learning research, 2011, 12: 2825-2830.

[27] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[J]. Advances in neural information processing systems, 2017, 30.

[28] LECUN Y, TOURETZKY D, HINTON G, et al. A theoretical framework for back-propagation[C]//Proceedings of the 1988 Connectionist Models Summer School, San Mateo, CA, 1988, 1: 21-28.
