Genetic Algorithm-based Training Method for Memristor Neural Networks (2024)

Research Article

Genetic Algorithm-based Training Method for Memristor Neural Networks

by Wei Zhang, Qingtian Zhang, Huaqiang Wu

International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 186 - Number 49
Year of Publication: 2024
Authors: Wei Zhang, Qingtian Zhang, Huaqiang Wu
DOI: 10.5120/ijca2024924139

Wei Zhang, Qingtian Zhang, Huaqiang Wu. Genetic Algorithm-based Training Method for Memristor Neural Networks. International Journal of Computer Applications. 186, 49 (Nov 2024), 7-13. DOI=10.5120/ijca2024924139

@article{10.5120/ijca2024924139,
  author     = {Wei Zhang and Qingtian Zhang and Huaqiang Wu},
  title      = {Genetic Algorithm-based Training Method for Memristor Neural Networks},
  journal    = {International Journal of Computer Applications},
  issue_date = {Nov 2024},
  volume     = {186},
  number     = {49},
  month      = {Nov},
  year       = {2024},
  issn       = {0975-8887},
  pages      = {7-13},
  numpages   = {9},
  url        = {https://ijcaonline.org/archives/volume186/number49/genetic-algorithm-based-training-method-for-memristor-neural-networks/},
  doi        = {10.5120/ijca2024924139},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}

%0 Journal Article
%1 2024-11-27T00:39:32.251813+05:30
%A Wei Zhang
%A Qingtian Zhang
%A Huaqiang Wu
%T Genetic Algorithm-based Training Method for Memristor Neural Networks
%J International Journal of Computer Applications
%@ 0975-8887
%V 186
%N 49
%P 7-13
%D 2024
%I Foundation of Computer Science (FCS), NY, USA

Abstract

In recent years, deep learning and large models have significantly advanced artificial intelligence applications in areas such as natural language processing and computer vision. However, as model scales grow, the demand for computing power increases, revealing the limitations of traditional von Neumann architectures. Memristor-based in-memory computing offers a promising alternative, yet neural networks deployed on memristor arrays suffer from accuracy loss due to device non-idealities. To address this, the authors introduce a novel genetic algorithm (GA)-based training methodology specifically designed for memristor arrays to enhance neural network performance. They detail the framework and strategic operations of this approach, support it with empirical validation on a series of lightweight models, and demonstrate substantial accuracy improvements. Additionally, they explore the impact of various hyperparameter settings on model precision. Overall, this approach significantly enhances the accuracy of lightweight neural networks on memristor arrays, with important implications for edge computing environments.
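To make the general idea concrete, the following is a minimal Python sketch of GA-based fine-tuning of weights that are read out through a noisy (memristor-like) device model. It is not the authors' implementation: the noise model, fitness function, GA operators, and all hyperparameters are illustrative assumptions.

# Minimal sketch of GA-based fine-tuning for weights deployed on a simulated
# memristor array. All names and hyperparameters below are assumptions for
# illustration, not the method described in the paper.
import numpy as np

rng = np.random.default_rng(0)

def simulate_memristor_readout(weights, noise_std=0.05):
    # Model device non-idealities as multiplicative conductance noise.
    return weights * (1.0 + rng.normal(0.0, noise_std, size=weights.shape))

def evaluate(weights, x, y):
    # Fitness: classification accuracy of a single layer read out from the array.
    logits = x @ simulate_memristor_readout(weights)
    return float(np.mean(np.argmax(logits, axis=1) == y))

def ga_finetune(w0, x, y, pop_size=20, generations=50, sigma=0.02, elite=4):
    # Initialize the population from the pretrained weights plus small mutations.
    pop = [w0 + rng.normal(0.0, sigma, size=w0.shape) for _ in range(pop_size)]
    for _ in range(generations):
        fitness = np.array([evaluate(w, x, y) for w in pop])
        order = np.argsort(fitness)[::-1]           # best individuals first
        parents = [pop[i] for i in order[:elite]]   # elitist selection
        children = []
        while len(children) < pop_size - elite:
            a, b = rng.choice(elite, size=2, replace=False)
            mask = rng.random(w0.shape) < 0.5       # uniform crossover
            child = np.where(mask, parents[a], parents[b])
            child = child + rng.normal(0.0, sigma, size=w0.shape)  # mutation
            children.append(child)
        pop = parents + children
    fitness = np.array([evaluate(w, x, y) for w in pop])
    return pop[int(np.argmax(fitness))]

# Toy usage: 16-dimensional inputs, 4 classes, weights degraded by deployment noise.
x = rng.normal(size=(256, 16))
w_true = rng.normal(size=(16, 4))
y = np.argmax(x @ w_true, axis=1)
w_deployed = w_true + rng.normal(0.0, 0.3, size=w_true.shape)
w_ga = ga_finetune(w_deployed, x, y)
print("accuracy before:", evaluate(w_deployed, x, y), "after:", evaluate(w_ga, x, y))

Because fitness is evaluated through the noisy readout itself, the search favors weight settings that remain accurate under device non-idealities, which is the gap the GA-based approach targets.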

Index Terms

Computer Science

Information Sciences

Memristor neural networks

in-memory computing

genetic algorithm

Keywords

Algorithms, Optimization, Image Classification
