Binary Neural Networks: Algorithms, Architectures, and Applications

Binary Neural Networks: Algorithms, Architectures, and Applications | 218 | Baochang Zhang

Deep learning has achieved impressive results in image classification, computer vision, and natural language processing. To achieve better performance, deeper and wider networks have been designed, increasing the demand for computational resources. The number of floating-point operations (FLOPs) has increased dramatically with larger networks, and this has become an obstacle to deploying convolutional neural networks (CNNs) on mobile and embedded devices. In this context, Binary Neural Networks: Algorithms, Architectures, and Applications focuses on CNN compression and acceleration, which are important topics for the research community. We describe numerous methods, including parameter quantization, network pruning, low-rank decomposition, and knowledge distillation. More recently, to reduce the burden of handcrafted architecture design, neural architecture search (NAS) has been used to automatically build neural networks by searching over a vast architecture space. Our book also introduces NAS and its state-of-the-art performance in various applications, such as image classification and object detection. We also describe extensive applications of compressed deep models in image classification, speech recognition, object detection, and tracking. These topics can help researchers better understand the usefulness and potential of network compression in practical applications. Interested readers should have basic knowledge of machine learning and deep learning to better understand the methods described in this book.
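As a rough illustration of the 1-bit parameter quantization at the heart of binary neural networks, the sketch below binarizes convolution weights with a sign function and trains through it with a straight-through estimator (STE). This is a minimal sketch of the general technique, not code from the book; the class names (BinarizeSTE, BinaryConv2d) and the per-tensor mean-absolute-value scaling are my own illustrative assumptions.

Code:
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinarizeSTE(torch.autograd.Function):
    # Forward: binarize weights to {-alpha, +alpha}, where alpha is the mean
    # absolute value of the weight tensor (a per-tensor scaling factor).
    # Backward: straight-through estimator, i.e. pass the gradient through
    # where |w| <= 1 and zero it elsewhere.
    @staticmethod
    def forward(ctx, weight):
        ctx.save_for_backward(weight)
        alpha = weight.abs().mean()
        return torch.sign(weight) * alpha

    @staticmethod
    def backward(ctx, grad_output):
        (weight,) = ctx.saved_tensors
        return grad_output * (weight.abs() <= 1).to(grad_output.dtype)


class BinaryConv2d(nn.Conv2d):
    # A Conv2d layer that binarizes its real-valued weights on the fly during
    # the forward pass; the underlying real-valued weights are what get updated.
    def forward(self, x):
        w_bin = BinarizeSTE.apply(self.weight)
        return F.conv2d(x, w_bin, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)


if __name__ == "__main__":
    layer = BinaryConv2d(3, 16, kernel_size=3, padding=1)
    out = layer(torch.randn(1, 3, 32, 32))
    print(out.shape)  # expected: torch.Size([1, 16, 32, 32])

At inference time, binarized weights can additionally be packed into bit vectors so that convolutions reduce to XNOR and popcount operations, which is where most of the memory and compute savings of BNNs come from.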

Key Features

• Reviews recent advances in CNN compression and acceleration

• Elaborates recent advances on binary neural network (BNN) technologies

• Introduces applications of BNN in image classification, speech recognition, object detection, and more

Baochang Zhang is a full professor with the Institute of Artificial Intelligence, Beihang University, Beijing, China. He was selected for the Program for New Century Excellent Talents in University of the Ministry of Education of China, chosen as the Academic Advisor of the Deep Learning Lab of Baidu Inc., and honored as a Distinguished Researcher of the Beihang Hangzhou Institute in Zhejiang Province. His research interests include explainable deep learning, computer vision, and pattern recognition. His HGPP and LDP methods were state-of-the-art feature descriptors, with 1234 and 768 Google Scholar citations, respectively, and both are regarded as "Test-of-Time" works. His team's 1-bit methods achieved the best performance on ImageNet. His group also won the ECCV 2020 Tiny Object Detection, COCO Object Detection, and ICPR 2020 Pollen Recognition challenges.

Sheng Xu received a BE in automotive engineering from Beihang University, Beijing, China. He holds a PhD and is currently with the School of Automation Science and Electrical Engineering, Beihang University, specializing in computer vision, model quantization, and compression. He has made significant contributions to the field and has published about a dozen papers as first author in top-tier conferences and journals such as CVPR, ECCV, NeurIPS, AAAI, BMVC, IJCV, and ACM TOMM, four of which were selected as oral or highlight presentations. Dr. Xu also participates actively in the academic community as a reviewer for various international journals and conferences, including CVPR, ICCV, ECCV, NeurIPS, ICML, and IEEE TCSVT. His expertise contributed to his group's victory in the ECCV 2020 Tiny Object Detection Challenge.

Mingbao Lin received a BS from Fuzhou University, Fuzhou, China, in 2016, and completed his MS-PhD study at Xiamen University, Xiamen, China, receiving a PhD in intelligence science and technology in 2022. He is currently a senior researcher with the Tencent Youtu Lab, Shanghai, China. His publications in top-tier conferences and journals include IEEE TPAMI, IJCV, IEEE TIP, IEEE TNNLS, CVPR, NeurIPS, AAAI, IJCAI, ACM MM, and more. His current research interests include developing efficient vision models and information retrieval.

Tiancheng Wang received a BE in automation from Beihang University, Beijing, China, and is currently pursuing a PhD with the Institute of Artificial Intelligence, Beihang University. During his undergraduate studies, he received the Merit Student Award for several consecutive years, along with various scholarships for academic excellence and academic competitions. He has been involved in several AI projects, including research on behavior detection and intention understanding and an unmanned air-based vision platform. His current research interests include deep learning and network compression; his goal is to explore highly energy-efficient models and to drive the deployment of neural networks on embedded devices.

Dr. David Doermann is an Empire Innovation Professor at the University at Buffalo (UB), New York, US, and the director of the University at Buffalo Artificial Intelligence Institute. Before joining UB, he was a program manager at the Defense Advanced Research Projects Agency (DARPA), where he developed, selected, and oversaw approximately $150 million in research and transition funding in the areas of computer vision, human language technologies, and voice analytics. He coordinated performers on all projects, orchestrating consensus, evaluating cross-team management, and overseeing fluid program objectives.



Contents of Download:
Binary Neural Networks Algorithms, Architectures, and Applications.pdf (30.37 MB)


NitroFlare Link(s) (Premium Link)
Code:
Please log in to view the code.
RapidGator Link(s)
Code:
Please log in to view the code.
 