KIEE
The Transactions of the Korean Institute of Electrical Engineers
Open Access | Monthly
ISSN: 1975-8359 (Print)
ISSN: 2287-4364 (Online)
http://www.tkiee.org/kiee
ISO Journal Title: Trans. Korean. Inst. Elect. Eng.
2023-05 (Vol.72 No.05)
DOI: 10.5370/KIEE.2023.72.5.607
References
1
T. Young, D. Hazarika, S. Poria, E. Cambria, 2018, Recent Trends in Deep Learning Based Natural Language Processing, IEEE Computational Intelligence Magazine, Vol. 13, pp. 55-75
2
M. I. Jordan, T. M. Mitchell, Jul 2015, Machine learning: Trends, perspectives, and prospects, Science, Vol. 349, No. 6245, pp. 255-260
3
R. Elshawi, M. Maher, S. Sakr, 2019, Automated Machine Learning: State-of-The-Art and Open Challenges, ArXiv190602287 Cs Stat
4
S. Abreu, 2019, Automated Architecture Design for Deep Neural Networks, ArXiv
5
K. He, X. Zhang, S. Ren, J. Sun, 2016, Deep Residual Learning for Image Recognition, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 770-778
6
N. Ma, X. Zhang, H.-T. Zheng, J. Sun, 2018, ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design, CoRR, Vol. abs/1807.11164, pp. -
7
J. Bergstra, R. Bardenet, Y. Bengio, B. Kégl, 2011, Algorithms for Hyper-Parameter Optimization, Advances in Neural Information Processing Systems, Vol. 24, pp. 2546-2554
8
J. Bergstra, Y. Bengio, 2012, Random Search for Hyper- Parameter Optimization, J. Mach. Learn. Res., Vol. 13, No. 10, pp. 281-305
9
B. Shahriari, K. Swersky, Z. Wang, R. P. Adams, N. de Freitas, 2016, Taking the Human Out of the Loop: A Review of Bayesian Optimization, Proc. IEEE, Vol. 104, No. 1, pp. 148-175
10
T. Akiba, S. Sano, T. Yanase, T. Ohta, M. Koyama, 2019, Optuna: A Next-generation Hyperparameter Optimization Framework, Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 2623-2631
11
J. Bergstra, B. Komer, C. Eliasmith, D. Yamins, D. D. Cox, 2015, Hyperopt: a Python library for model selection and hyperparameter optimization, Comput. Sci. Discov., Vol. 8, No. 1, pp. 014008-
12
2022, KerasTuner
13
P. Probst, A.-L. Boulesteix, B. Bischl, 2019, Tunability: Importance of Hyperparameters of Machine Learning Algorithms, J. Mach. Learn. Res., Vol. 20, No. 53, pp. 1-32
14
M. Claesen, B. De Moor, Apr. 06, 2015, Hyperparameter Search in Machine Learning, arXiv
15
H. J. P. Weerts, A. C. Mueller, J. Vanschoren, Jul. 15, 2020, Importance of Tuning Hyperparameters of Machine Learning Algorithms, arXiv
16
V. Nair, G. E. Hinton, 2010, Rectified linear units improve restricted boltzmann machines, in Proceedings of the 27th International Conference on International Conference on Machine Learning, Madison, WI, USA, pp. 807-814
17
J. Brownlee, Jan. 22, 2019, How to Configure the Learning Rate When Training Deep Learning Neural Networks, Machine Learning Mastery
18
Y. Bengio, 2012, Practical Recommendations for Gradient-Based Training of Deep Architectures, in Neural Networks: Tricks of the Trade: Second Edition, G. Montavon, G. B. Orr, and K.-R. Müller, Eds. Berlin, Heidelberg: Springer, pp. 437-478
19
S. Agrawal, 2021, Hyperparameters in Deep Learning, Medium
20
, [Coursera] Neural Networks for Machine Learning (University of Toronto) (neuralnets)
21
D. P. Kingma, J. Ba, 2015, Adam: A Method for Stochastic Optimization, in 3rd International Conference on Learning Representations, San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings
22
J. Duchi, E. Hazan, Y. Singer, 2011, Adaptive Subgradient Methods for Online Learning and Stochastic Optimization, Journal of machine learning research, Vol. 12, No. 7, pp. 39-
23
P. Liashchynskyi, P. Liashchynskyi, 2019, Grid search, random search, genetic algorithm: a big comparison for NAS, arXiv preprint arXiv:1912.06059
24
M. A. J. Idrissi, H. Ramchoun, Y. Ghanou, M. Ettaouil, 2016, Genetic algorithm for neural network architecture optimization, in 2016 3rd International Conference on Logistics Operations Management (GOL), pp. 1-4
25
J. Bergstra, R. Bardenet, Y. Bengio, B. Kégl, 2011, Algorithms for Hyper-Parameter Optimization, in Advances in Neural Information Processing Systems, Vol. 24
26
R. Joseph, 2018, Grid Search for model tuning, Medium
27
M.-A. Zöller, M. F. Huber, 2021, Benchmark and Survey of Automated Machine Learning Frameworks, ArXiv190412054 Cs Stat
28
A. Klein, S. Falkner, S. Bartels, P. Hennig, F. Hutter, 2017, Fast Bayesian Optimization of Machine Learning Hyperparameters on Large Datasets, in Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, pp. 528-536
29
M. Seeger, 2004, Gaussian processes for machine learning, Int. J. Neural Syst., Vol. 14, No. 2, pp. 69-106
30
F. Hutter, H. H. Hoos, K. Leyton-Brown, 2011, Sequential Model-Based Optimization for General Algorithm Configuration, in Learning and Intelligent Optimization, pp. 507-523
31
D. Maclaurin, D. Duvenaud, R. Adams, 2015, Gradient-based Hyperparameter Optimization through Reversible Learning, in Proceedings of the 32nd International Conference on Machine Learning, pp. 2113-2122
32
A. S. Wicaksono, A. A. Supianto, 2018, Hyper Parameter Optimization using Genetic Algorithm on Machine Learning Methods for Online News Popularity Prediction, Int. J. Adv. Comput. Sci. Appl. IJACSA, Vol. 9, No. 12, pp. 33-31
33
L. Li, K. Jamieson, G. DeSalvo, A. Rostamizadeh, A. Talwalkar, 2022, Hyperband: Bandit-Based Configuration Evaluation for Hyperparameter Optimization, International Conference on Learning Representations
34
2022, GitHub - fmfn/BayesianOptimization: A Python implementation of global optimization with gaussian processes, https://github. com/fmfn/BayesianOptimization
35
Mar. 18, 2022, Optuna: A hyperparameter optimization framework, optuna
36
Jan. 12, 2023, Hyperopt: Distributed Hyperparameter Optimization, hyperopt
37
K. Team, Jan. 13, 2023, Keras documentation: KerasTuner
38
L. Li, K. Jamieson, G. DeSalvo, A. Rostamizadeh, A. Talwalkar, 2017, Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization, , pp. 6765-6816
39
Md. H. A. Banna, 2021, Attention-Based Bi-Directional Long-Short Term Memory Network for Earthquake Prediction, IEEE Access, Vol. 9, pp. 56589-56603
40
Murat Koklu, Ilker Ali Ozkan, 2020, Multiclass classification of dry beans using computer vision and machine learning techniques, Comput. Electron. Agric., Vol. 174, pp. 105507-
41
İ. Çinar, M. Koklu, P. D. Ş. Taşdemi̇r, Dec. 2020, Classification of Raisin Grains Using Machine Vision and Artificial Intelligence Methods, Gazi Mühendis. Bilim. Derg., Vol. 6, No. 3, pp. -
42
L. Candillier, V. Lemaire, Aug. 2013, Active learning in the real-world design and analysis of the Nomao challenge, in The 2013 International Joint Conference on Neural Networks (IJCNN), pp. 1-8
43
A. Krizhevsky, I. Sutskever, G. E. Hinton, 2012, ImageNet Classification with Deep Convolutional Neural Networks, in Advances in Neural Information Processing Systems, Vol. 25
44
P. Srinivas, R. Katarya, Mar. 2022, hyOPTXg: OPTUNA hyper- parameter optimization framework for predicting cardiovascular disease using XGBoost, Biomed. Signal Process. Control, Vol. 73, pp. 103456-
45
J.-P. Lai, Y.-L. Lin, H.-C. Lin, C.-Y. Shih, Y.-P. Wang, P.-F. Pai, Feb. 2023, Tree-Based Machine Learning Models with Optuna in Predicting Impedance Values for Circuit Analysis, Micromachines, Vol. 14, No. 2, pp. -
46
J. Joy, M. P. Selvan, 2022, A comprehensive study on the performance of different Multi-class Classification Algorithms and Hyperparameter Tuning Techniques using Optuna, in 2022 International Conference on Computing, Communication, Security and Intelligent Systems (IC3SIS), pp. 1-5
47
Y. Nishitsuji, J. Nasseri, Mar. 2022, LSTM with forget gates optimized by Optuna for lithofacies prediction,
48
I. Ekundayo, 2020, OPTUNA Optimization Based CNN-LSTM Model for Predicting Electric Power Consumption, masters, Dublin, National College of Ireland
49
S. Putatunda, K. Rama, 2018, A Comparative Analysis of Hyperopt as Against Other Approaches for Hyper-Parameter Optimization of XGBoost, in Proceedings of the 2018 International Conference on Signal Processing and Machine Learning, Shanghai China, pp. 6-10
50
R. J. Borgli, H. Kvale Stensland, M. A. Riegler, P. Halvorsen, 2019, Automatic Hyperparameter Optimization for Transfer Learning on Medical Image Datasets Using Bayesian Optimization, in 2019 13th International Symposium on Medical Information and Communication Technology (ISMICT), pp. 1-6
51
J. Zhang, Q. Wang, W. Shen, Dec 2022, Hyper-parameter optimization of multiple machine learning algorithms for molecular property prediction using hyperopt library, Chin. J. Chem. Eng., Vol. 52, No. , pp. -
52
N. Schwemmle, T.-Y. Ma, May 2021, Hyperparameter Optimization for Neural Network based Taxi Demand Prediction, presented at the BIVEC-GIBET Benelux Interuniversity Association of Transport Researchers: Transport Research Days 2021
53
B. Abdellaoui, A. Moumen, Y. Idrissi, A. Remaida, 2021, Training the Fer2013 Dataset with Keras Tuner., pp. 412-
54
A. Jafar, M. Lee, 2021, High-speed hyperparameter optimization for deep ResNet models in image recognition, in Cluster Computing, pp. 1-9
55
A. Jafar, L. Myungho, Aug. 2020, Hyperparameter Optimization for Deep Residual Learning in Image Classification, in 2020 IEEE International Conference on Autonomic Computing and Self-Organizing Systems Companion (ACSOS-C), pp. 24-29