The future of Edge AI
The transformative impact of Edge AI across systems and sectors stems from the synergy between localized data processing and advanced AI model deployment, enabling real-time decisioning and enhanced privacy.

Introduction
Artificial intelligence (AI) is transforming the world in unprecedented ways and redefining solutions across myriad sectors, from health care to entertainment, finance to manufacturing, and beyond. Different forms of AI enable this transformation, each with distinct characteristics and capabilities to provide specific solutions. Edge AI refers to where and how AI computations are conducted, while Generative AI (GenAI) refers to what those computations accomplish. Edge AI enables organizations to deploy GenAI models at the source of data generation for local processing and real-time decision-making. GenAI, coupled with Edge AI, will unlock new value for businesses and consumers. For example, a smart speaker can use Edge AI to understand and respond to natural language commands while keeping the voice data secure and private. GenAI can enable the speaker to generate personalized and engaging conversations, music, and stories based on the user's preferences and feedback. GenAI and Edge AI are not just computing technologies in a data farm. They are powerful technologies that can create new services and applications that enhance the value and experience of consumers.

Edge AI empowers organizations to access massive data sets and utilize AI to gain additional insights and drive new business models. The edge computing market is attracting various players, including public cloud providers, hardware and chip manufacturers, industrial goods companies, content distribution networks, telecom operators, and startups. According to Gartner, Generative AI will be a key feature of 60% of edge computing deployments by 2029, up from a mere 5% in 2023.¹ This shows how quickly and widely GenAI technologies are being adopted, and how they enable advanced AI applications to run on edge computing devices. The Edge AI market is projected to reach a valuation of $269.82 billion by 2032, with a CAGR of 33.3% over the forecast period of 2024–2032.² Organizations that integrate edge computing into their operational technology and build supporting ecosystems will likely be positioned at the forefront of this technological revolution, driving industry differentiation and competitive advantage.

The Edge AI ecosystem comprises an array of entities that extend from end-user devices to infrastructure/hardware, software companies, and cloud service providers, and serves both consumer and B2B enterprises. This convergence of multiple technologies can lead to transformative advancements but simultaneously presents technical and operational challenges. Strategic and synergistic collaborations among key ecosystem players are essential to navigate these obstacles and bring computational power and intelligence closer to the physical world.

As GenAI adoption expands, Edge AI devices become more capable of generating and adapting to new data and scenarios, moving beyond executing predefined models and rules. This could prove valuable when accessing large data volumes is difficult due to privacy concerns or when training new data models is constrained due to increased resource costs.

[Figure: GenAI on Edge/Edge AI spans the continuum from devices (IoT sensors, CPE devices) through connectivity and edge sites (base stations, hub sites, central offices, enterprise WAN) to enterprise, private, and public clouds.]
Enhancing real-time applications through edge computing
Successful implementation of Edge AI could enhance user experiences through real-time decisioning and improved privacy; however, adoption is contingent on navigating challenges of resource limitations and system architectures. The enhanced solution may lie in a hybrid environment that balances the advantages of cloud and edge computing.

The combination of Generative AI with edge computing creates powerful synergies. By leveraging the computational capabilities of edge devices such as smartphones, Internet of Things (IoT) sensors, and edge servers, Generative AI models can be deployed directly at the point of data collection or consumption. This enables real-time, low-latency generation of content, such as images, text, or audio, tailored to the specific context or user preferences. Together, GenAI and edge computing enhance personalized recommendations, augmented reality experiences, and on-device language translation, offering unprecedented opportunities for efficiency, privacy, and seamless user interactions. Furthermore, Edge AI enables real-time analysis and decisioning, making it effective for applications requiring rapid responses or operating in disconnected environments, such as smart home devices, autonomous vehicles, industrial IoT systems, and wearable health monitors.

Generative AI models often require significant computational resources and memory, with increasingly large model parameters and deep neural networks (DNNs), and these demands cannot always be mitigated through interventions such as model pruning or quantization. In contrast, edge devices have a small form factor and thus may require some transmission to the cloud for additional processing of large Generative AI models. This creates the need for privacy-preservation techniques, including adaptive DNN partitioning frameworks, homomorphic encryption, and secure multiparty computation. Furthermore, executing preservation techniques at scale requires a specific data architecture. While cloud-only environments offer more scalability and accuracy, they may also result in more latency and exposure. Edge-only environments promote speed but often with more constraints and complexity. A hybrid environment may provide a balance based on the use case requirements.
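To make the quantization intervention mentioned above concrete, the sketch below applies post-training dynamic quantization with PyTorch to a small stand-in network and compares the serialized weight sizes. It is illustrative only: the layer sizes are hypothetical, and a real Edge AI deployment would apply the same idea to a trained model and validate accuracy afterward.

# Illustrative only: post-training dynamic quantization with PyTorch,
# one common way to shrink a model's memory footprint for edge targets.
# The tiny model below is a stand-in, not an actual GenAI model.
import io
import torch
import torch.nn as nn

def state_dict_size_mb(model: nn.Module) -> float:
    """Serialize the model's weights in memory and report their size in MB."""
    buffer = io.BytesIO()
    torch.save(model.state_dict(), buffer)
    return buffer.getbuffer().nbytes / 1e6

# Stand-in network; a real edge workload would load a trained model instead.
fp32_model = nn.Sequential(
    nn.Linear(512, 1024), nn.ReLU(),
    nn.Linear(1024, 1024), nn.ReLU(),
    nn.Linear(1024, 256),
)

# Dynamic quantization converts Linear weights from 32-bit floats to 8-bit ints.
int8_model = torch.quantization.quantize_dynamic(
    fp32_model, {nn.Linear}, dtype=torch.qint8
)

print(f"fp32 weights: {state_dict_size_mb(fp32_model):.1f} MB")
print(f"int8 weights: {state_dict_size_mb(int8_model):.1f} MB")  # roughly 4x smaller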
Edge computing addresses the limitations of traditional cloud-centric architectures by distributing computational resources closer to the data source, reducing latency and bandwidth consumption. This proximity to data sources minimizes latency-sensitive applications' dependency on centralized cloud servers, improving responsiveness and scalability.
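A hybrid environment implies a placement decision for each workload. The sketch below shows one simple form such a routing rule could take; the thresholds, request attributes, and helper names are hypothetical, and a production system would weigh many more factors, such as cost, battery, regulatory constraints, and model availability.

# Hypothetical sketch of a hybrid edge/cloud placement rule. The Request
# fields and thresholds are illustrative assumptions, not part of any
# specific product or framework.
from dataclasses import dataclass

@dataclass
class Request:
    latency_budget_ms: int   # how long the caller can wait for a response
    contains_pii: bool       # whether the payload includes sensitive personal data
    model_size_mb: int       # footprint of the model needed to serve the request

EDGE_MODEL_LIMIT_MB = 500     # assumed memory budget of the edge device
EDGE_LATENCY_CUTOFF_MS = 100  # below this, a cloud round trip is assumed too slow

def place_inference(req: Request) -> str:
    """Return 'edge' or 'cloud' for a single inference request."""
    fits_on_device = req.model_size_mb <= EDGE_MODEL_LIMIT_MB
    # Keep sensitive data local whenever the device can serve the model.
    if req.contains_pii and fits_on_device:
        return "edge"
    # Tight latency budgets also favor local execution.
    if req.latency_budget_ms < EDGE_LATENCY_CUTOFF_MS and fits_on_device:
        return "edge"
    # Otherwise use the cloud for scale and larger models.
    return "cloud"

print(place_inference(Request(latency_budget_ms=50, contains_pii=False, model_size_mb=200)))    # edge
print(place_inference(Request(latency_budget_ms=2000, contains_pii=False, model_size_mb=4000)))  # cloud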
The integration of edge computing and Generative AI introduces novel opportunities and challenges. Organizations can leverage real-time data streams to produce context-aware and personalized content by deploying Generative AI models at the network edge. For instance, edge devices equipped with Generative AI in retail environments can dynamically generate tailored product recommendations based on customer preferences and behavior, enhancing the shopping experience. Moreover, edge computing can mitigate privacy concerns associated with centralized data processing by performing AI inference locally, keeping sensitive information within the network perimeter. This decentralized approach can enhance data privacy and reduce reliance on external communication, mitigating security risks associated with data transmission over public networks.

Distributed computing drives new experiences for consumers and enterprises
End-user devices such as smartphones, tablets, and sensors increasingly rely on more performant chips and AI technologies to provide highly personalized user experiences.

End-user devices, like smartphones, wearables, intelligent cameras, and IoT gadgets, serve as the front-line interface between individuals and accelerated productivity. These devices have traditionally relied on cloud-based AI services for tasks such as natural language processing, image recognition, and predictive analytics. However, the advent of Edge AI has shifted the computational burden from remote servers to the devices themselves, thereby enhancing AI efficiency. Moreover, it helps address privacy concerns by localizing data processing, an increasingly important consideration in today's data-sensitive world. These improvements unlock new operational benefits and enterprise value, preparing us for the native AI wave. For example:

[Figure: The edge continuum, from the device edge (personal devices, sensors, wearables) through the far and near edge (gateways, base stations, hub sites, central offices, edge data centers) to the cloud (enterprise, private, and public clouds and CSP core/data centers); privacy is highest and latency lowest closest to the device.]
User experience
Edge AI considerably improves user interaction by customizing design elements for specific applications. This encompasses the development of robust devices suited for onsite use in sectors such as mining, construction, or military endeavors, allowing for on-device data processing independent of inconsistent field connectivity. Additionally, Edge AI facilitates the creation of compact devices for wearable technologies, offering quick health monitoring capabilities, like identifying abnormal heart rhythms and promptly notifying users without the need to connect to another device.

Latency
Edge AI enables end-user devices to execute sophisticated AI tasks locally, reducing the dependency on constant connectivity to the cloud. For instance, smartphones can now execute AI-powered features like facial recognition and language translation promptly and accurately, even offline. Wearable devices equipped with Edge AI algorithms can monitor vital signs, detect anomalies, and provide real-time health insights to users. Continuous glucose monitoring for people with diabetes empowers individuals to take proactive control of their health. In urgent cases like fall detection for the elderly, the low latency of Edge AI can support potentially lifesaving decision-making by promptly alerting emergency contacts and services. This is one of numerous applications in which Edge AI addresses latency issues, demonstrating enhanced performance in use cases such as autonomous vehicles, industrial automation, and smart retail solutions, among others.
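As a simplified illustration of the on-device pattern described above, the sketch below flags a possible fall from a short window of accelerometer readings entirely locally, with no cloud round trip. The thresholds and the alert callback are assumptions for illustration, not clinical logic.

# Hypothetical on-device fall check from 3-axis accelerometer samples (in g's).
# Thresholds and the alert callback are illustrative, not clinical guidance.
import math

IMPACT_G = 2.5     # spike in acceleration magnitude suggesting an impact
STILLNESS_G = 0.3  # margin above the ~1 g gravity baseline while lying still

def magnitude(sample):
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_fall(window, alert):
    """window: list of (x, y, z) samples; alert: local callback (no cloud)."""
    mags = [magnitude(s) for s in window]
    impact_at = next((i for i, m in enumerate(mags) if m > IMPACT_G), None)
    if impact_at is None:
        return False
    # A fall pattern here: sharp impact followed by little movement.
    after = mags[impact_at + 1:]
    if after and max(after) < 1.0 + STILLNESS_G:
        alert("Possible fall detected - contacting emergency contacts")
        return True
    return False

# Simulated window: normal motion, an impact spike, then stillness (~1 g).
samples = [(0.1, 0.2, 1.0)] * 5 + [(2.0, 1.8, 1.5)] + [(0.0, 0.1, 1.0)] * 10
detect_fall(samples, alert=print)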
Privacy
Edge AI is designed to reduce privacy and data exposure concerns associated with transmitting information to centralized servers by processing sensitive data on-device. End-user devices can analyze and act upon personal data locally without compromising user privacy or exposing sensitive information to external threats. With data and AI computations occurring on devices, there are fewer endpoints for security breaches. This decentralized approach to AI can help ensure greater security and confidentiality, fostering user trust and confidence.

Cost benefits/enterprise value
Embedding AI capabilities into devices can be economically beneficial. It can reduce the dependence on cloud-based services, saving data transfer and storage costs. It also enables offline AI functionalities, such as real-time language translation or advanced image recognition, that can improve customer engagement and offer a competitive edge. Although this strategy requires investment in storage and computing power, it can also lower the costs of connectivity and data transfer.

Optimizing edge infrastructure for the digital ecosystem
The diversity of edge devices and solution-specific offerings by infrastructure and hardware providers will enable the deployment of powerful AI models but may present issues of compatibility and widespread usability.

With the advent of Edge AI, solution providers, including infrastructure suppliers, storage providers, and networking companies, are designing more fit-for-purpose environments optimized for scalability, reliability, and interoperability. Infrastructure and hardware providers enable robust Edge AI services by optimizing hardware for AI workloads, simplifying edge deployment, and tailoring solutions across industries based on use cases.

The current Edge AI infrastructure market is robust and is witnessing increased investments by providers. According to IDC, edge infrastructure spending is projected to grow from $25.3 billion in 2022 to $55.6 billion by 2027, at a CAGR of 17%.³ In 2022, networking and security represented the most significant edge infrastructure workload at $2.4 billion, but by 2027, content delivery is expected to take the lead at $4.9 billion in infrastructure spending that year.⁴ This shift is driven by the growing need to support content delivery in remote areas. For consumer workloads, services to interact with the environment, such as augmented reality/virtual reality (AR/VR) and digital signage, are projected to see infrastructure spending grow at a 21.7% CAGR.⁵ For enterprise workloads, edge infrastructure spending on customer relationship management (CRM) will be the fastest growing, with a 19.5% CAGR,⁶ driven by the need for location-based contact management and more onsite retail support.
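The headline growth rate cited above can be sanity-checked directly from the two endpoint figures; the short calculation below uses only the numbers quoted in this paragraph.

# Quick consistency check of the IDC figures cited above:
# $25.3B (2022) growing to $55.6B (2027) over five years.
start, end, years = 25.3, 55.6, 2027 - 2022

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~17.1%, matching the cited 17%

# The same compounding applied year by year:
value = start
for year in range(2023, 2028):
    value *= 1 + cagr
    print(year, round(value, 1))  # reaches 55.6 in 2027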
Integrating Edge AI with GenAI will require creating bespoke infrastructure and hardware solutions that warrant an evolution of all the entities across the technology ecosystem. For example:

Semiconductors and chipmakers: The rapid advancements in semiconductor technology have produced more powerful, energy-efficient chips capable of independently executing sophisticated AI tasks. This hardware innovation has been propelled by breakthroughs in AI model design, pushing models that are more potent, compact, and efficient in energy and memory usage. Similarly, the demand for chips capable of supporting advanced AI tasks directly on devices is surging. Chipmakers and semiconductor companies are accelerating their production of specialized AI chips that support increased processing power and energy efficiency, enabling smarter, AI-driven devices.⁷ The passage of the $280 billion CHIPS (Creating Helpful Incentives to Produce Semiconductors) and Science Act is projected to catalyze further investments in advanced semiconductor chip manufacturing.⁸

Original equipment manufacturers (OEMs): OEMs are integrating advanced chips into their devices, boosting the AI capabilities of smartphones, tablets, and IoT devices. This trend is already observed in the latest consumer electronics, where AI features such as enhanced photography, voice assistants, and personalized recommendations are becoming standard.

Independent software vendors (ISVs): ISVs are crucial in developing software that can leverage both on-device and cloud-based AI capabilities. The rise of hybrid AI is prompting ISVs to create more sophisticated AI applications that are adaptable to various hardware specifications and can operate in a distributed computing environment.

The deployment of AI models combined with edge computing presents unique challenges and opportunities due to the heterogeneous nature of edge devices. As the market grows, there will likely be an increase in collaboration between ISVs and OEMs to create agile and adaptive cross-platform approaches for seamless integration across various hardware devices. For example, retail stores can leverage a surveillance camera equipped with Edge AI to function independently, without internet access or human monitoring, and alert store managers about theft or vandalism. In health care, a doctor can generate a summary of patient notes while maintaining HIPAA compliance through an AI-enabled end-user device or at the hospital's secure edge server.
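The standalone retail camera described above follows a simple loop: capture locally, score locally, alert locally. The sketch below outlines that pattern with hypothetical stubs in place of the camera driver and the on-device detection model; no specific vendor product is implied.

# Schematic of the offline edge-camera pattern described above.
# read_frame() and score_frame() are hypothetical stubs standing in for a
# camera driver and an on-device detection model; no data leaves the device.
import random
import time

ALERT_THRESHOLD = 0.9  # assumed confidence above which the store manager is notified

def read_frame():
    """Stub for grabbing a frame from the local camera."""
    return [[random.random() for _ in range(8)] for _ in range(8)]

def score_frame(frame) -> float:
    """Stub for an on-device model scoring the frame for theft/vandalism."""
    return random.random()

def notify_local(message: str):
    """Local alert only (e.g., on-premises dashboard or pager); no cloud upload."""
    print(message)

def monitor(iterations: int = 20):
    for _ in range(iterations):
        frame = read_frame()
        score = score_frame(frame)
        if score > ALERT_THRESHOLD:
            notify_local(f"Alert: suspicious activity (confidence {score:.2f})")
        time.sleep(0.01)  # placeholder for the camera's frame interval

monitor()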
The opportunities for edge infrastructure providers are promising but come with interoperability challenges, including standardization and integration complexity. The diversity of edge devices, from smartphones to IoT sensors, makes deployment standardization difficult. Enterprises will need the flexibility to support multiple hardware providers, since solutions will be customized for efficiency. As Edge AI adoption grows across industries, hardware providers must offer software-agnostic or plug-and-play services to support solutions across various use cases. One critical advantage of interoperability across multiple players in the edge ecosystem is the ability to support diverse hardware providers, allowing solutions to be tailored to specific client needs. Achieving interoperability will require compatibility and reduced integration complexity, ultimately enabling a more cohesive and efficient edge computing environment. By addressing the challenges of infrastructure standardization and cross-provider interoperability, edge infrastructure providers will enable optimization for AI workloads and offer flexibility for fit-for-purpose environments.

Leveraging cloud services to accelerate Edge AI
As Edge AI adoption grows, new solutions should involve more flexible and customizable offerings from hyperscalers and hybrid cloud architectures to balance localized real-time decision-making and robust processing capabilities.

Edge AI enables quick decision-making without relying solely on centralized cloud-based infrastructure. For example, in smart cities, Edge AI processes data from traffic cameras and sensors locally to manage traffic flow in real time, while in health care, it allows portable diagnostic devices to analyze patient data instantly at the point of care. To enhance and complement the functionality of Edge AI deployments, cloud services offered by hyperscalers provide scalable resources that balance real-time localized analytics and the strong processing capabilities of the cloud.

Public cloud environments tend to offer comprehensive solutions due to their diverse and flexible consumption models. Major public cloud solution providers, or hyperscalers, are increasingly investing in expansive AI capabilities designed to accelerate on-demand access to compute, storage, and network applications. The redundancy and fault tolerance offered by public cloud providers play a crucial role in supporting the dependability required for immediate decision-making in scenarios like autonomous vehicles, where any latency or malfunction can lead to significant ramifications.

One of the emerging trends is the capability of hyperscalers to work across the edge ecosystem. According to Gartner, nearly 5% of large enterprises will adopt a distributed cloud solution from hyperscalers by 2027 to execute edge computing workloads outside traditional data centers. Hyperscalers have adopted a multifaceted edge strategy that leverages their cloud infrastructure, edge devices, and AI capabilities. They offer a solid framework for connecting edge solutions with cloud services, enabling smooth data processing, storage, and analytics. Hyperscalers increasingly invest in application-specific integrated circuits to deliver powerful AI capabilities to edge devices. Furthermore, the edge platforms provided by hyperscalers ensure a secure link and governance of IoT devices, allowing for the gathering and examination of data at the edge in cases such as smart agriculture, where sensors are employed to track soil conditions and the welfare of crops.

Despite the potential benefits of leveraging cloud capabilities for Edge AI, it is crucial to evaluate specific considerations before implementing this approach. Edge AI deployments relying on hyperscalers are inherently dependent on network connectivity. This dependency can present issues in remote or bandwidth-constrained environments, potentially limiting the availability and effectiveness of Edge AI applications. Furthermore, adopting a specific hyperscaler for Edge AI can lead to the issue of vendor lock-in, where organizations become reliant on a single provider's proprietary technologies and services, restricting customization and hindering interoperability with other platforms. In the case of private cloud deployments, organizations achieve greater security and data sovereignty by hosting AI workloads on-premises or in dedicated infrastructure. However, building and maintaining a private cloud environment, such as an onsite data center for financial institutions or health care organizations, entails a significant initial investment and ongoing operational expenses. This could limit scalability and agility, particularly for Edge AI deployments spread across multiple locations.

In conclusion, integrating cloud services and Edge AI presents opportunities for robust processing capabilities and enhanced real-time decision-making. However, hyperscalers should offer more adaptable solutions for the diverse range of edge computing applications and use cases to capitalize on these opportunities fully. Organizations should carefully weigh the pros and cons of using cloud services for Edge AI and select options that enable targeted transformation to meet their specific needs and goals.
The road ahead for Edge AI
Edge AI presents exciting opportunities and challenges for various ecosystem players, who must collaborate, innovate, and upskill to leverage its benefits.

The decentralization of AI through Edge AI represents a profound shift in the technological landscape. Applications are inherently designed to leverage distributed computing resources by optimizing cloud and edge device performance. This improves efficiency and helps ensure that AI applications are more resilient and adaptable to varying operational conditions. As this paradigm evolves, it will likely open new pathways for innovation and reshape how businesses operate across all sectors, making AI an integral and transformative component of our digital lives.

The future of Edge AI holds exciting prospects with crucial benefits for end users and ecosystem players, such as lower latency in processing, increased security, cost-effectiveness across solutions, and improved user experience. To realize these benefits, ecosystem players must navigate challenges such as interoperability and standardization across infrastructure providers and cloud-based solutions. The transition to Edge AI necessitates synergistic collaboration across ecosystem players, from chip manufacturers to cloud service providers, to develop flexible solutions. To foster this collaboration and advance innovation, Edge AI players should consider a targeted approach that invests in research and development, builds partnership alliances, and establishes a culture of ongoing learning and adaptation.

As industry demand grows, the increased need for Edge AI talent poses another obstacle. The market demand for professionals with specific technical skills outpaces the supply. The convergence of GenAI and edge computing has created a need to upskill the existing workforce or engage seasoned system integrators with subject matter expertise and end-to-end implementation capabilities.

Authors
Rahul Bajpai, Principal, Deloitte Consulting LLP
Baris Sarer, Principal, Deloitte Consulting LLP
Nakul Mate, Manager, Deloitte C
Neha Dasari, Consultant, Deloitte Consulting LLP
Arpan Tiwari, Managing Director, Deloitte Consulting LLP
Mark Szarka, Senior Manager, Deloitte C
Edem Isliamov, Senior Consultant, Deloitte Consulting LLP

Acknowledgments
The authors would like to thank Dan Littmann, Stacy Hodgins, Brandon Kulik, Allison Smith, Narayanan Narasimhan, Ankit Shanker, and Manish Rajendran for their contributions to the research and insights of this report.
Endnotes
1. Thomas Bittman et al., "Market guide for edge computing," Gartner, March 2024.
2. Fortune Business Insights, "Edge AI market size, share & industry analysis," August 2024.
3. Max Pepper and Jennifer Cooke, "Worldwide Edge Enterprise Infrastructure Workloads Forecast, 2023–2027," IDC, October 2023.
4. Ibid.
5. Ibid.
6. Ibid.
7. Amber Wang and Qasim Nauman, "Intel unveils new AI chips at Computex amid rivalry with Nvidia, AMD, Qualcomm," Tech Explore, June 4, 2024.
8. The White House, "Fact Sheet: CHIPS and Science Act will lower costs, create jobs, strengthen supply chains, and counter China," August 9, 2022.
As used in this document, "Deloitte" means Deloitte Consulting LLP, a subsidiary of Deloitte LLP. Please see for a detailed description of our legal structure. Certain services may not be available to attest clients under the rules and regulations of public accounting.

This publication contains general information only and Deloitte is not, by means of this publication, rendering accounting, business, financial, investment, legal, tax, or other professional advice or services. This publication is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional advisor. Deloitte shall not be responsible for any loss sustained by any person who relies on this publication.

About Deloitte
Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee ("DTTL"), its network of member firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also referred to as "Deloitte Global") does not provide services to clients. In the United States, Deloitte refers to one or more of the US member firms of DTTL, their related entities that operate using the "Deloitte" name in the United States, and their respective affiliates. Certain services may not be available to attest clients under the rules and regulations of public accounting. Please see to learn more about our global network of member firms.

Copyright 2025 Deloitte Development LLC. All rights reserved.