AI Chip Market Research Report 2023

Copyright 2023 by ICV TAnK. This work may not be reproduced or distributed in any form or by any means without express written permission of the publisher.

Introduction

This report mainly introduces artificial intelligence chips and their classification, market size, and applications. Artificial intelligence chips, also known as AI accelerators or computing cards, are computing chips specialized for artificial intelligence algorithms. There are currently two development paths for AI chips: one is the traditional computing architecture represented by GPUs, FPGAs, and ASICs (TPU, BPU, etc.); the other subverts the classic von Neumann computing architecture and uses brain-like neural structures to enhance computing power. According to deployment location, AI chips can be divided into cloud AI chips, edge AI chips, and terminal AI chips. In the future, as more and more intelligent devices such as smart cars, smartwatches, smart appliances, and VR/AR devices enter people's lives, the amount of data collected at the edge and on terminals will grow exponentially, and integrated edge-cloud computing power layout solutions will become mainstream.
AI Chip Development History

AI chips are the underlying cornerstone of AI development. NVIDIA invented the GPU as early as 1999, but it was not until 2009 that Stanford University published a paper describing how the computing power of modern GPUs could far exceed that of multi-core CPUs (by more than 70 times), shortening AI training time from weeks to hours.

Timeline of milestones (1950s-2020s):
1958 - Invention of the integrated circuit
1971 - Intel 4004, the first microprocessor
1982 - Introduction of the first RISC processors
1991 - Emergence of digital signal processors (DSPs)
2006 - Multi-core processors become mainstream
2011 - Development of advanced neural network processors
2021 - Rise of AI-specific chips like Google's TPU
2023 - Integration of AI capabilities in mainstream processors
The Development Path of AI Chips

There are currently two development paths for artificial intelligence chips. One is the traditional computing architecture represented by GPUs, FPGAs, and ASICs (TPU, BPU, etc.):

GPU (general-purpose): advantages - strong computing power, mature products; disadvantages - low efficiency, difficult programming.
FPGA (semi-customized): advantages - high average performance, low power consumption, high flexibility; disadvantages - weak peak computing power, difficult programming languages.
ASIC (fully customized): advantages - strong average performance, low power consumption, small size; disadvantages - not reprogrammable, long development time, high technical risk.

The other approach is to subvert the classic von Neumann computing architecture and use brain-like neural structures to enhance computing power.
AI Chip Classification

Dimension 1: Tasks Undertaken

According to the tasks they undertake, AI chips are divided into training chips and inference chips.

Training chips: computing chips used to build neural network models. A complex neural network model must be trained systematically on large volumes of labeled data to adapt it to a specific function; training requires high computing performance (high throughput) and low power consumption, and the emphasis is on absolute computing power.

Inference chips: mainly use trained neural network models to make predictions on new data, focusing on comprehensive indicators such as energy consumption per unit, latency, and cost. They pay more attention to low latency and low power consumption and require relatively little computing power.

AI training and inference, using an image-recognition AI for cats and dogs as an example (a minimal sketch follows below):
Training - the AI views many images and learns the characteristics of dogs and cats.
Inference - when the AI sees a new image, it can determine whether it is a cat or a dog.
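To make the training/inference split concrete, here is a minimal, hypothetical PyTorch sketch (not taken from the report): a tiny classifier is first trained on a labeled mini-batch, which is the compute-heavy phase, and is then used for a single low-latency prediction. The model shape, random data, and hyperparameters are illustrative assumptions only.

```python
import torch
from torch import nn, optim

# Tiny stand-in for a cat/dog classifier: 32x32 RGB images, two classes.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)
loss_fn = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)

# Training: repeated passes over labeled data; throughput (absolute compute) matters.
images = torch.randn(8, 3, 32, 32)      # placeholder for a labeled mini-batch
labels = torch.randint(0, 2, (8,))      # 0 = cat, 1 = dog
for _ in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()                      # backpropagation dominates the cost
    optimizer.step()

# Inference: one forward pass on new data; latency and energy per prediction matter.
model.eval()
with torch.no_grad():
    new_image = torch.randn(1, 3, 32, 32)
    predicted = model(new_image).argmax(dim=1).item()
    print("dog" if predicted == 1 else "cat")
```

In practice the training loop is the workload that cloud training chips are built for, while the single forward pass under torch.no_grad() is the kind of work inference chips are optimized to run at low latency and low power.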
Dimension 2: Deployment Location

Cloud AI chips are responsible for processing and managing the data and tasks of cloud computing, mainly carrying out both AI training and inference. Edge AI chips can process and store data locally without an internet connection, mainly carrying out AI inference. Terminal AI chips are mainly responsible for executing user instructions and operations, as well as communicating and interacting with the cloud and the edge, and mainly carry out AI inference.

Processing capacity: cloud AI chips have the strongest processing power; edge AI chips are a compromise compared with cloud chips; terminal AI chips are the weakest of the three.
Latency: cloud - high latency; edge - reduced latency; terminal - extremely low latency.
Security: cloud - more stringent security measures are needed; edge - higher data security; terminal - maximizes privacy protection.
Application: cloud - large-scale data processing and complex AI models; edge - applications that require quick response and partial autonomous decision-making, such as autonomous vehicles; terminal - small, personalized devices.
Cloud, edge, and endpoint AI systems (schematic): in each configuration, endpoints such as computers, mobile phones, and motor controllers enter only the necessary data and receive the AI output (a hypothetical sketch follows below).
Cloud AI system - the load is concentrated on the AI chips of the cloud computing equipment.
Edge AI system - the training and inference load can be distributed to the AI chips of edge computing equipment.
Endpoint AI system - the load is distributed to AI chips at the endpoints themselves.
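The report does not prescribe an implementation, but the division of labour above can be illustrated with a small, entirely hypothetical Python sketch: a terminal device runs low-latency local inference on its own AI chip and forwards only the necessary data upstream to an edge or cloud system when the local result is not confident enough. All function names and the confidence threshold are invented for illustration.

```python
import random  # stands in for a real sensor feed and model runtime

CONFIDENCE_THRESHOLD = 0.8  # hypothetical cut-off, not taken from the report


def run_on_terminal_chip(frame: bytes) -> tuple[str, float]:
    """Pretend on-device inference on the terminal's AI chip (low latency, low power)."""
    return "cat", random.uniform(0.5, 1.0)


def offload_upstream(frame: bytes) -> str:
    """Placeholder for sending only the necessary data to an edge or cloud AI system."""
    return "cat (verified by edge/cloud inference)"


def handle_frame(frame: bytes) -> str:
    label, confidence = run_on_terminal_chip(frame)  # terminal chip answers first
    if confidence >= CONFIDENCE_THRESHOLD:
        return label                                  # no network round-trip needed
    return offload_upstream(frame)                    # distribute the harder load


print(handle_frame(b"raw camera bytes"))
```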
AI Chip Market Size

The Global AI Chip Market Is Growing Rapidly

As the AIGC industry enters a period of rapid development, the performance of large AI models has continuously improved. As the core of computing power, AI chips have grown rapidly on the back of the AIGC industry. The global AI chip market grew from 4.50 billion dollars in 2017 to 34.17 billion dollars in 2022. From the perspective of the Chinese market, the size of China's AI chip market increased from 0.76 billion dollars in 2017 to 5.74 billion dollars in 2022, a compound annual growth rate of about 50%, and it is expected to account for 16.80% of the global market by 2027 (these rates are recomputed from the table below as a quick check).

Global and China AI chip market ($, billion); the 5-year CAGR is roughly 50% for 2017-2022 and roughly 30% for 2022-2027E in both series:
Year    2017  2018  2019   2020   2021   2022   2023E  2024E  2025E  2026E  2027E
Global  4.50  6.75  10.13  15.19  22.78  34.17  44.42  57.75  75.08  97.60  126.88
China   0.76  1.13  1.70   2.55   3.83   5.74   7.46   9.70   12.61  16.40  21.32
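The growth rates quoted above follow directly from the table. As a quick check, here is a short Python snippet (not part of the report) that recomputes them from the start and end values:

```python
# CAGR from the identity: end = start * (1 + r)**years  =>  r = (end / start)**(1 / years) - 1
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# Values (in billions of dollars) taken from the market table above.
print(f"China AI chips, 2017-2022:   {cagr(0.76, 5.74, 5):.1%}")     # ~49.8%, the ~50% cited
print(f"Global AI chips, 2017-2022:  {cagr(4.50, 34.17, 5):.1%}")    # ~50%
print(f"Global AI chips, 2022-2027E: {cagr(34.17, 126.88, 5):.1%}")  # ~30% forecast
print(f"China share of global market, 2027E: {21.32 / 126.88:.1%}")  # ~16.8%
```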
Cloud Training Chips Account for the Main Market

Driven by the continuing application and penetration of large AI models, cloud computing, and data centers, AI chips have developed rapidly. This is especially true of cloud training chips: the global market for cloud training chips reached 16.40 billion dollars in 2022, accounting for approximately 48% of the entire AI chip market. From another perspective, AI chips mainly comprise graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs). Among them, GPUs hold the main market share, accounting for 84% of AI chip products in 2022 with a market size of 28.70 billion dollars, a figure expected to reach 101.50 billion dollars by 2027.

Global AI chip segmentation market ($, billion):
Year                     2017  2018  2019  2020  2021   2022   2023E  2024E  2025E  2026E  2027E
Cloud training           2.25  3.38  4.96  7.29  10.94  16.40  20.43  26.57  34.53  43.92  57.10
Cloud inference          0.36  0.81  1.22  2.28  3.42   6.15   8.88   12.71  16.52  24.40  31.72
Edge/terminal inference  1.89  2.57  3.95  5.62  8.43   11.62  15.10  18.48  24.02  29.28  38.06

Market of AI chips with different architectures ($, billion):
Year   2017  2018  2019  2020   2021   2022   2023E  2024E  2025E  2026E  2027E
GPU    3.83  5.74  8.61  12.91  19.36  28.70  36.87  47.36  60.81  78.08  101.50
FPGA   0.41  0.61  0.91  1.37   2.05   3.42   4.44   5.78   7.51   9.76   12.69
ASIC   0.27  0.41  0.61  0.91   1.37   2.05   3.11   4.62   6.76   9.76   12.69
AI Chip Market Players

Global AI chip industry map (figure): vendors are grouped into cloud chips and edge/terminal AI chips.

Nvidia Has an Absolute Advantage in Cloud Chips

In the field of cloud training chips, Nvidia has an absolute advantage. Nvidia's GPU + CUDA computing platform is currently the most mature AI cloud training solution, alongside international vendors such as Google, Intel, AMD, and Xilinx. Having entered the field relatively late, Chinese manufacturers are still primarily start-ups and have not yet formed an influential cloud training chip ecosystem.

Compared with cloud training chips, cloud inference chips must weigh a broader set of factors, including power consumption per unit, latency, and cost, and here FPGA/ASIC chips offer more outstanding performance and application advantages. In the cloud inference chip market, international manufacturers such as Nvidia, Google, and Intel dominate, while Chinese companies such as Cambricon, Bitmain, and Huawei are also actively expanding their presence.
Edge AI Chips Have Development Prospects

At present, the edge/terminal inference chip market is fragmented, and no chip manufacturer holds an absolutely dominant position. The AI terminal inference chip market in the security field is relatively stable, with Nvidia and Mobileye as industry leaders and HiSilicon and Ambarella competing strongly with them. The mobile phone market is mainly dominated by the original main-control chip manufacturers Qualcomm, Huawei, and Apple, while NVIDIA, TI, Renesas, and NXP are the main participants in the field of autonomous driving.

Edge/terminal inference scenarios compared:
IoT scenarios - tasks: image detection, video detection, speech recognition; power supply: on-site power at the deployment site; reliability: high; manufacturers: Google, NVIDIA.
Mobile internet - tasks: photo/scene recognition, AR applications, voice assistants; power supply: consumer-grade polymer lithium batteries; reliability: medium; manufacturers: Apple, Samsung, ARM.
Intelligent security - tasks: image detection, video detection; power supply: on-site power at the deployment site; reliability: high; manufacturers: Ingenic, Intellifusion.
Autonomous driving - tasks: data fusion, planning, image semantic segmentation; power supply: power-grade hard-shell lithium batteries; reliability: extremely high; manufacturers: NVIDIA, Intel.
AI Chip Development Trends

AI Chips Are Moving Towards the Edge

With more and more intelligent devices such as smart cars, smartwatches, smart appliances, and VR/AR devices in use, the amount of data collected at the edge and on terminals is increasing exponentially, and the requirements for real-time response and low latency are rising. More intelligent data processing will therefore be completed on the edge side. Compared with cloud AI chips, edge AI chips can significantly reduce dependence on cloud computing resources, cut network bandwidth consumption, lower data transmission latency, improve processing efficiency, and optimize the user experience.

However, diversified application scenarios also bring problems for edge AI: different scenarios need customized algorithms, their requirements for chip computing power and power consumption differ, and computing power, algorithms, and applications are often fragmented. In the future, integrated edge-cloud computing power layout solutions will become mainstream, not only optimizing algorithm structures but also fundamentally empowering edge applications with more complete solutions.
About ICV TAnK

At ICV we are passionately curious about New Technology, and we strive to deliver the most robust market data and insights to help our customers make the right strategic decisions. We bring together the deepest intelligence across the widest set of capital-intensive industries and markets. By connecting data across variables, our analysts and industry specialists present our customers with a richer, highly integrated view of their world. That is the benefit of The New Intelligence: we are able to isolate cause and effect, risk and opportunity, in new ways that empower our customers to make well-informed decisions with greater confidence.

Our offices
Canada: 5250 Fairwind Dr., Mississauga, Ontario, L5R 3H4, Canada
Singapore: 101 Upper Cross Street, #04-17, People's Park Centre, Singapore