2023 AUTONOMOUS VEHICLE TECHNOLOGY REPORT
Examining the Latest Developments in Self-Driving Vehicles

Contents
About the Contributors
Foreword
State of the Art in Autonomous Vehicles Technologies
Sensing Technologies
Cameras and Vision Systems
Harnessing AI-Enhanced Vision
Prominent Companies Developing AV Vision Systems
Interview: Insights from Mouser Electronics
LiDAR
LiDAR Product Overview
Companies Developing LiDAR Technologies for AVs
Interview: Insights from Murata
RADAR
Millimeter Wave RADARs
Companies Developing RADAR Technologies for AVs
Interview: Insights from MacroFab
Ultrasonic Sensors
Interview: Insights from Nexperia
Thinking and Learning
Frontiers of AI Learning Approaches for AVs
NLP and GANs Reshaping Autonomous Driving
Harnessing the Power of LLMs for AV Applications
Companies Developing AI Algorithms for AV Applications
Interview: Insights from NVIDIA
Edge Computing
Companies Developing Edge Computing for AVs
Real-time Operating Systems for Autonomous Vehicles
Advancements in RTOS Systems for AVs
Interview: Insights from Autoware Foundation
Communication and Connectivity
Vehicle Communication
5G Connectivity
Innovations in 5G for AV Applications
Future Connectivity Standards
Security
Securing AVs with Blockchain
Companies Developing Security Solutions for AVs
Interview: Insights from SAE
Autonomous Vehicle Tech Stack Review
Waymo
Tesla
Cruise
Volvo
Report Summary
Sponsor Pages: Mouser Electronics, Murata, MacroFab, Nexperia, ADLINK, SAE
Partners: Autoware Foundation
About Wevolver
References

About the Contributors

Cassiano Ferro Moraes
Florianópolis, Brazil
Technical writer with an electrical engineering background and over five years of experience writing articles about electronics, renewable energy, and electric vehicles.
Published multiple articles in IEEE Xplore, including publications in some of the most recognized conferences and journals of the power electronics industry. Vast experience writing articles in the fields of electronics, renewable energy, power converters, hardware-in-the-loop, and electric vehicles at Write Choice.

Dr. Miroslav Milovanovic
Niš, Serbia
Assistant professor at the Faculty of Electronic Engineering at the University of Niš, holding a PhD in Computer Science and Electrotechnics. Leader of the Laboratory for Intelligent Control within the Control Systems Department. Author of over 45 scientific publications, centered on Data Science and Deep Learning applications.

Gustavo Bruisma
Pato Branco, Brazil
Electrical Engineer graduated from the Federal Technological University of Paraná (UTFPR), with an MSc degree in the same field and a deep focus on control systems for electric vehicles. Vast experience writing articles in the fields of electronics, renewable energy, power converters, hardware-in-the-loop, and electric vehicles at Write Choice.

Ana Carla Sorgato
Florianópolis, Brazil
Environmental and Sanitary Engineer graduated from the Federal University of Santa Maria (UFSM), with an MSc in Environmental Engineering from UFSC. Currently pursuing a PhD in Environmental Engineering at the Federal University of Santa Catarina (UFSC) and writing articles about renewable energy, electric vehicles, and engineering at Write Choice.

Ian Dickson
London, United Kingdom
Freelance automotive journalist and editor with more than two decades of experience in consumer media and branded content. Cutting his teeth as a road tester on What Car?, where he learnt the art of turning the complex into the simple, he quickly moved through the ranks and ended up editor of MSN Cars, at the time the UK's largest motoring website. For the past 10 years, he's been creating and developing content and strategies for brands like Porsche, Ferrari, Volvo Cars, and HERE Technologies.

Samir Jaber
Leipzig, Germany
Content Specialist with a background in engineering, nanotechnology, and scientific research. Samir has comprehensive experience working with major engineering and technology companies as a writer, editor, and digital marketing consultant. Featured author in 30+ industrial magazines with a focus on IoT, nanotechnology, materials science, engineering, and sustainability. Samir is also an award-winning engineering researcher in the fields of nanofabrication and microfluidics. Editor-in-Chief of the Wevolver 2023 Edge AI Report.

Danny Shapiro
Redwood City, United States
Danny Shapiro is NVIDIA's Vice President of Automotive, focusing on solutions that enable faster and better design of automobiles, as well as in-vehicle solutions for infotainment, navigation
 and driver assistance. He's a 25-year veteran of the computer graphics and semiconductor industries and has been with NVIDIA since 2009. Prior to NVIDIA, Danny served in marketing, business development, and engineering roles at ATI, 3Dlabs, Silicon Graphics, and Digital Equipment. He holds a BSE in electrical engineering and computer science from Princeton University and an MBA from the Haas School of Business at UC Berkeley. He lives in Northern California, where his home solar panel system charges his electric car.

Alexander Wischnewski
Munich, Germany
Managing Director and Co-Founder of driveblocks, a modular, scalable, robust, and safe platform for autonomous driving with a focus on commercial vehicle applications. Former PhD lead for the TUM Autonomous Motorsport team.

Matteo Barale
Milan, Italy
Co-CEO of autonomous mobility start-up PIX MOVING. Design strategy expert with experience in robotics, industrial design, transportation, and architecture; titled Technology Pioneer by the World Economic Forum.

David Webb
London, United Kingdom
David Webb is Head of Innovation at the UK Government's Centre for Connected and Autonomous Vehicles (CCAV). CCAV works across government to support the UK's developing connected and automated vehicle market, and believes that CAVs could change the way we travel, making road transport safer, smoother, and more accessible to all. David has a Master's in Aerospace Engineering from Queen Mary, University of London and has spent the last 10 years working for the Ministry of Defence in a variety of technical, analytical, and engagement roles.

John Soldatos
Athens, Greece
Honorary Research Fellow at the University of Glasgow. John Soldatos holds a Ph.D. in Electrical & Computer Engineering from the National Technical University of Athens (2000) and is currently an Honorary Research Fellow at the University of Glasgow, UK (2014-present). He was Associate Professor and Head of the Internet of Things (IoT) Group at the Athens Information Technology (AIT), Greece (2006-2019), and Adjunct Professor at Carnegie Mellon University, Pittsburgh, PA (2007-2010). He has significant experience working closely with large multinational industries (e.g., IBM, INTRACOM, INTRASOFT International) as an R&D consultant and delivery specialist, while being a scientific advisor to various high-tech startup enterprises. Dr. Soldatos is an expert in Internet-of-Things (IoT) and Artificial Intelligence (AI) technologies and applications, including IoT/AI applications in smart cities, finance (Finance 4.0), and industry (Industry 4.0).

Jess Miley
Berlin, Germany
Content Director at Wevolver. Content Specialist with a background in architecture and design. Jess has experience working with neuro-tech startups, animated video studios, and news sites as a writer, editor, and business innovation consultant.

Foreword

In 2020, Wevolver launched its first Autonomous Vehicle Report, which provided a comprehensive knowledge foundation about the technologies enabling autonomous cars. We are now pleased to launch a new report that surveys the advances in this arena over the last three years. Further, we provide a snapshot comparison of four leading AV companies, their tech stacks, and their approaches, to understand how these technologies are applied in actual use cases. The report is augmented with
interviews with the report sponsors, who provide deeper insights into the current state of autonomous vehicles, highlighting their priorities, challenges, and leadership objectives. The report examines autonomy from the perspective of passenger vehicles, following the approach of the previous report. However, many of the technologies mentioned are also relevant for other autonomous vehicle types that are making a significant impact in industries and applications such as last-mile delivery, warehouse and logistics, agriculture, mining, search and rescue, and healthcare.

To create this report, we interviewed dozens of industry experts and collaborated with technical researchers and writers from around the world. We have attempted to cover all relevant technologies; however, due to space constraints, we had to limit some areas. This report was made possible by the generous support of its sponsors, the tireless effort of the Wevolver team, the expertise and generosity of our consulting experts, and the attention to detail of our writers, researchers, and designers. We hope you find value in this report, and we look forward to continuing to make this critical knowledge available for all.

The core of this report is to make clear the current status of the technologies that form autonomous vehicles. We have separated the chapters into groups, starting with Sensing, where we take a closer look at the latest advances in cameras, LiDAR, RADAR, ultrasonic sensors, and emerging imaging radar technologies. The Thinking and Learning and Edge Computing chapters examine the dynamic landscape that encompasses advanced AI algorithms, natural language processing (NLP), machine learning techniques, and the transformative impact of edge computing. Finally, we explore the technologies that ensure reliable communication, from rapid 5G connectivity and dynamic over-the-air (OTA) updates to the use of blockchain, as well as Intrusion Detection and Prevention Systems (IDPS) and AI/ML-driven cybersecurity. Each section highlights recent innovations, outlines why certain technologies have become dominant, and gives examples of
which companies are prominent in the area. We provide some high-level definitions and explanations, but the first Wevolver report offers more fundamental knowledge of the technologies. The report's final chapter looks at four leading autonomous vehicle companies: Waymo, Tesla, Cruise, and Volvo. We compare and contrast their tech stacks, presenting a clear overview of the direction of the industry.

State of the Art in Autonomous Vehicles Technologies

Sensing Technologies

At the cutting edge of autonomous vehicle (AV) technology, the confluence of advanced sensing modalities forms the cornerstone of vehicular autonomy. At the forefront lies the integration of high-definition cameras with a suite of diverse sensors, including ultrasonic, LiDAR, and RADAR. This combination, known as sensor fusion, represents the core of current efforts to endow vehicles with the perception capabilities necessary for full autonomous driving. High-definition cameras, prized for their acute visual acuity and color discernment, play an indispensable role in this sensor suite. They excel at interpreting complex visual stimuli, from the nuanced hues of traffic lights to the intricate patterns
 of road signs. Yet cameras have a weakness: their performance can degrade at night or in inclement weather. It is in these gaps that sensor fusion becomes critically important.

At the cutting edge of sensor integration, the pairing of ultrasonic sensors with LiDAR and RADAR is addressing the shortcomings of standalone systems. This integration is particularly pivotal for close-range detection, a realm where traditional LiDAR sensors often falter. Such precision in proximal perception is vital for executing complex parking maneuvers and navigating constricted spaces with accuracy.

The collaboration between ultrasonic and LiDAR sensors forges a more robust interpretative framework. While LiDAR provides a detailed topographical map of the vehicle's surroundings, it is occasionally prone to misinterpretation, especially in the presence of reflective surfaces or atypical object contours. Here, ultrasonic sensors contribute an additional dimension of spatial awareness, validating and refining LiDAR's data and thus mitigating the risk of erroneous object recognition.

Extending this synergy further, the integration of ultrasonic sensors with RADAR technology supports perception systems that span both short- and long-range detection. RADAR, with its broader wave patterns, often struggles with pinpoint accuracy at close range. Ultrasonic technology fills this void, granting AVs enhanced situational awareness, an attribute of paramount importance in scenarios that demand a blend of near and distant perception, such as highway navigation interspersed with intricate parking sequences.

In this arena, vehicle manufacturers are not merely choosing between sensor technologies; rather, they are strategically orchestrating an ensemble of LiDAR variants, each contributing its unique strengths to the collective sensory intelligence of AVs. The selection of specific LiDAR models is no longer a mere technical choice but a strategic decision, influenced by factors including application-specific requirements, cost-benefit analyses, and the relentless pace of technological innovation.

This chapter delves into the intricate and sophisticated world of sensing and vision technologies in autonomous vehicles. We explore how the nuanced integration of cameras with ultrasonic, LiDAR, and RADAR sensors is sculpting the frontier of autonomous navigation, steering us towards an era of unprecedented vehicular intelligence and autonomy.

Cameras and Vision Systems

Cameras hold a foundational and technically intricate position within autonomous vehicles, functioning as primary sensors that provide vital visual data for perception and navigation systems. Their role extends beyond mere image capture, encompassing intricate computer vision processes that interpret the surroundings with pixel-level precision. Cameras are instrumental in critical tasks, including real-time lane detection, object recognition, and complex depth perception, making them indispensable for AV safety and operational efficiency. In the last three years, there have been significant advancements in high-resolution cameras, which have shown a remarkable increase in their ability to capture fine details. This, in turn, has enabled autonomous vehicles to identify objects in their surroundings more accurately, making them more reliable and safe. In this section, we explore developments in vision technology that have impacted AV development over the last three years.

3D Stereo Vision

The 3D stereo vision system deployed on autonomous race cars. Image credit: Nerian.

3D stereo vision technology uses two cameras to determine the depth and precise position of objects in the environment, similar to how humans use binocular vision for depth perception. Stereo systems are an integral part of the future of autonomous vehicles, enabling them to navigate roads more safely than single cameras can. The technology has seen rapid growth over the past decade, with significant strides made by companies that enable automakers to quickly and inexpensively add 3D stereo vision to existing Advanced Driver Assistance Systems (ADAS) through software solutions.

The positioning of cameras in vehicles is an ongoing topic within the industry. Wider-spaced cameras can fall out of alignment when impacted by
 temperature shifts in the chassis or road vibrations, an issue when the cameras need to maintain alignment to within one-hundredth of a degree. Major players, such as Subaru's EyeSight and the Drive Pilot system in the Mercedes EQS, use stereo vision systems deployed in tighter formations to negate this; those systems work in tandem with RADAR. Stereo vision is an ever-growing technology, with researchers and developers exploring new ways to improve its accuracy, efficiency, and field of view. The biggest impacts will likely come from deep learning and neural networks being used to handle occlusion
and calibration issues. Other interesting areas of research include active stereo vision, which projects patterns or signals onto the scene to create artificial texture and contrast. Some of this cutting-edge research is being tested by university teams on the race track. For example, the Formula Student racing team of the University of Bayreuth is using Nerian's SceneScan Pro and the Karmin3 stereo camera to create a 3D stereo vision system for their autonomous racing car. [2,3]

Thermal cameras

In the early 2000s, several notable car manufacturers, including General Motors, BMW, and Honda, blazed the trail by introducing passive thermal cameras to enhance safety during nighttime driving. These thermal cameras were designed to address the dangers posed by animal collisions and the risk of pedestrian accidents in poorly lit or foggy areas. Their primary purpose was to provide invaluable assistance to human drivers.

However, the landscape of autonomous driving began to evolve significantly with the advent of the DARPA Grand Challenge. This competition sparked a surge of interest and substantial investment in various sensing technologies. Among them, LiDAR (Light Detection and Ranging) emerged as the frontrunner, capturing the lion's share of attention and financial support. Together with RADAR and visible cameras, this sensor suite gained widespread recognition as the optimal perception stack for achieving higher levels of autonomy.

In an effort to bolster their sensor capabilities, certain companies are incorporating thermal cameras into their sensor suites, recognizing the unique advantages they offer in complementing LiDAR, RADAR, and visible cameras. This additional sensor modality proves invaluable in addressing specific challenges, such as identifying animals and humans in environments characterized by low light or heavy obscurants like fog, smoke, or steam. Pedestrians are most at risk of an accident with a road vehicle after dark: more pedestrian fatalities occurred in the dark (75%) than in daylight (21%), dusk (2%), and dawn (2%). [4]

Notably, pioneers like Waymo Via and Plus.ai have harnessed the power of thermal cameras to advance autonomy in trucking, particularly on highways, enhancing safety and efficiency in long-haul transportation. Companies like Nuro, Cruise, and Zoox have adopted thermal cameras as part of their sensor repertoire for purpose-built vehicles designed to navigate the intricate landscapes of densely populated urban areas. These vehicles are not only revolutionizing last-mile food and grocery delivery but also providing innovative solutions in ride-hailing services. Through the strategic deployment of thermal cameras, these companies are significantly elevating the safety and effectiveness of their operations within urban environments.

Harnessing AI-Enhanced Vision

Traditional cameras capture raw visual data, which requires subsequent processing and interpretation to derive meaningful information about the surroundings. AI algorithms, especially deep learning models, have revolutionized this process by enabling cameras to interpret visual information from their surroundings, enhancing their ability to comprehend images. The integration of AI-enhanced vision represents a groundbreaking development that significantly improves the capabilities of camera systems in AVs.

A pedestrian crossing a dark suburban street: visible-light camera vs. FLIR thermal camera, captured by Foresight's test vehicle. Image credit: Foresight.

For example, HADAR, an AI-powered thermal imaging system created by Purdue and Michigan State University researchers, provides clear thermal images by interpreting heat signatures. It significantly improves AVs and robots by resolving the blurring "ghosting" effect seen in traditional thermal imaging. Moreover, Omniq has recently launched a face detection feature for AVs,
improving safety by recognizing faces to prevent crimes. Their AI uses neural network algorithms for smart decision-making and has already seen over 20,000 global installations. In a collaborative effort, SemiDrive and Kankan Tech are improving in-car imaging systems: SemiDrive's X9 chip powers the systems, and Kankan Tech provides comprehensive development services. Kankan Tech has expertise in high-resolution cabin cameras and has developed a camera-based alternative to traditional rearview mirrors. They have also introduced palm vein biometric recognition for AV access. The system, unaffected
by lighting changes thanks to its IR cameras, uses YOLOv7 algorithms for real-time face detection, analyzing facial expressions and head orientation for safety, with plans for commercial market integration after thorough testing.

Cameras, empowered by convolutional neural networks (CNNs) and appropriate classification ML techniques, enable AV vision systems to accurately identify and categorize objects, pedestrians, road signs, and lane markings. This level of understanding improves the vehicle's ability to make informed decisions in complex and dynamic traffic scenarios. AI-enhanced vision is crucial in autonomous vehicles, encompassing tasks like object identification, motion tracking, and classification. This technology significantly augments AVs' understanding of their surroundings, resulting in more informed and secure decision-making processes. [5]

An illustrative example of the potential of AI-enhanced
vision comes from research conducted at RIKEN in 2023. Their innovative approach, inspired by human memory-formation techniques, involves degrading the quality of high-resolution images used to train algorithms in self-supervised learning. This method enhances the algorithms' ability to identify objects in low-resolution images, addressing a notable challenge in the field of computer vision. [6]

Furthermore, researchers at Purdue University and Michigan State University have introduced a groundbreaking AI-enhanced camera imaging system known as HADAR (heat-assisted detection and ranging). HADAR utilizes AI to interpret heat signatures, effectively resolving issues such as ghosting that are commonly associated with thermal imaging. Its applications span a wide spectrum, from enhancing the perception of AVs and robots to enabling touchless security screenings at public events. [7]

Comparison between ghosting thermal vision and HADAR TeX vision. Image credit: NVIDIA.

Another example comes from NVIDIA, which has developed a pixel-level segmentation approach using a single deep neural network (DNN) to achieve comprehensive scene understanding. This technology can divide a scene into various object categories and identify distinct instances of those categories, as reflected in the lower panel's colors and numbers. The benefits of this technology are far-reaching, including reductions in training data, improved perception, and support for the safe operation of autonomous vehicles. These innovations collectively underscore the transformative potential of AI-enhanced vision in shaping the future of autonomous vehicles and related technologies. [9]

"We have algorithms that are reading for lanes, but there's also an object detection, but then there's also a DNN we call free space, which is looking for the absence of objects." - Danny Shapiro, VP of Automotive at NVIDIA

Panoptic segmentation DNN output from in-car inference on the embedded AGX platform. Top: predicted objects and object classes (blue = cars; green = drivable space; red = pedestrians). Bottom: predicted object-class instances along with computed bounding boxes (shown in different colors with instance IDs). Image credit: NVIDIA.

Prominent Companies Developing AV Vision Systems

This section highlights some of the cutting-edge vision systems currently enabling the development of AVs.

Mobileye

Mobileye uses a variety of cameras within its vision-based driver assistance systems, including fisheye cameras, wide-angle cameras, and thermal cameras. [10] In 2023, Mobileye launched the first camera-based Intelligent Speed Assist that complies with the new EU standards. Their technology, which only uses cameras, has received official approval throughout Europe, making it
 the first of its kind. Mobileye's technology can recognize various traffic signs, aiding Intelligent Speed Assist systems using cameras alone. It relies on Mobileye's 400-petabyte database of global driving footage to swiftly meet increasing automotive safety standards.

Mobileye SuperVision diagram presenting the components and coverage of the camera array. Image credit: Mobileye.

Continental

Continental develops various cameras, including fisheye, wide-angle, and thermal cameras, designed to meet the specific requirements of different AV applications. More specifically, the surround-view camera features fisheye optics for a short-range view and supports Ethernet or LVDS communication. In November 2022, Continental and Ambarella entered a collaboration to co-develop AI-based hardware and software solutions for assisted and automated driving. The partnership aims to produce products for global series production by 2026, addressing the increasing demand for assisted and automated driving technologies. The collaboration focuses on camera-based perception solutions for advanced driver assistance systems and on scalable full-stack systems for vehicles with Level 2+ and higher autonomy.

TIER IV

TIER IV is an open-source autonomous driving technology company that is expanding production in response to the huge interest in its Automotive HDR Camera C1, which launched in 2022. The camera is designed for autonomous mobility applications and has gained widespread adoption in various fields, including autonomous driving, driver assistance, autonomous mobile robots, security, and surveillance. These applications are possible thanks to its impressive 120 dB high dynamic range and high-quality automotive-grade hardware. Over 100 companies worldwide have implemented the C1 camera. Building
on the success of the C1, in June 2023 TIER IV introduced the C2 camera, a superior model with double the resolution at 5.4 megapixels, improving its recognition of distant objects and signals. Finally, TIER IV is developing the C3 camera, featuring an 8-megapixel image sensor to meet the demands of high-speed applications such as highway driving. The goal is to complete its development within the year and start providing it in early 2024.

Continental's advanced AV camera solutions. Image credit: Continental.

SPONSOR INTERVIEW

Mark Patrick, Director Technical Content, EMEA at Mouser Electronics
Accelerating AV Development Through Customer Collaboration

Mark, could you please describe your role at Mouser and the company's activities?

Mark Patrick: I oversee technical content for Mouser in EMEA. This role encompasses not just written content but also projects and event booth development aimed at engaging our primarily technical audience with relevant content and activities. Our goal is to both inspire and inform our audience, providing ideas and guidance on various projects, and offering detailed instructions, code, hardware, and related materials. Ultimately, we are a technical distributor, supplying highly technical products to design engineers and component buyers. The technical marketing aspect is inherent in what we do, as authenticity is crucial when addressing our audience. This requires the collaboration of engineering and content teams. My background lies in semiconductors, technical sales, and application support. My engineering team consists of master's-level electrical engineering students from the Technical University of Munich. They are a young, dynamic team capable of diverse tasks.

Thank you for the introduction. Can you elaborate on Mouser's activities?

Mark Patrick: Certainly. In simple terms, Mouser is a global distributor with full authorization from all our manufacturers. We stock products from around 1,200 manufacturers, including well-known brands and specialized niche players. Our commerce platform allows anyone to purchase products needed for their development, design, or production processes.

You serve a wide range of customers, from OEMs to startups, correct?

Mark Patrick: Our customer base is diverse, ranging from individuals, including DIY enthusiasts often referred to as "Fred in the shed", to small consultancies, and up to large corporations and OEMs. This includes well-known companies like Google and Apple, who seek the convenience of our services.

Could you tell us about Mouser's unique selling proposition (USP)?

Mark Patrick: There are other organizations similar to Mouser, but our USP lies in our focus on new product introduction. We lead our marketing efforts with the latest products and designs. We maintain over a million individual part numbers in stock at one location, ensuring a high-quality customer experience. This means that what you see on our website is readily available. We offer authentic and traceable components, a critical factor in light of recent supply chain concerns. Our vast inventory and commitment to offering new and innovative products set us apart.

How do you stay updated on the latest products and technologies?

Mark Patrick: We collaborate closely with our suppliers, maintaining relationships that provide insights into their upcoming products and release schedules. We are prepared to create content, including technical details, for these products, ensuring we can go live as soon as the product hits the market. Additionally, we create content that becomes highly visible on search engines, helping customers find these new products quickly.
100、This way,we facilitate access to technology and introduce it to new customers.What are the benefits of Mousers services for your customers?Mark Patrick:Our customers,often engineers working on designs,require rapid access to products,particularly during testing,proof of concept,and prototyping phase
101、s.They need assurance that the products are readily available.We offer this level of trust through our website,in-stock inventory,and our ability to deliver products within two to three days worldwide from a single location in the US.This high-quality service hinges on the convenience and ease of fi
102、nding products,combined with our informa-tive and inspirational content.With the increasing complexity of vehicles,particularly in the context of autonomous cars,how important is Mousers service in providing the necessary technology and components?Mark Patrick:Autonomous vehicles rely on various tec
103、hnologies,many of which require semiconductors for their functionality.The increasing capabilities and functionalities in modern vehicles are directly enabled by semiconductors.These components are now essential for even basic features like reversing cameras,navigation systems,audio systems,and safe
104、ty features.Semiconductors play a critical role in processing the data generated by sensors and providing a seamless user experience.As the automotive industry continues to innovate,the role of semiconductors will only grow.If an OEM needs a specific part that doesnt exist,can Mouser assist in facil
105、itating the creation of such parts?1819Mark Patrick:While we primarily stock standard products,we do collaborate with cus-tomers and tech support to address inquiries about specific components.However,for OEMs developing entirely new components,it is more common to work directly with manufacturers t
106、o create bespoke components,particularly given the scale of produc-tion in the automotive industry.Regarding autonomous vehicles,do you think many of the underlying technologies and components are shared across different manufacturers?Mark Patrick:Yes,there are common components and technologies tha
107、t serve specific functions in autonomous vehicles,such as connectors,semiconductors,and sensors.Many of these components are not exclusive to a single manufacturer.However,there can be custom parts created for specific OEMs.In general,a wide range of standard components can be used to build various
aspects of autonomous vehicle systems, with the focus shifting more toward software and user experience differentiation.

When do you anticipate the mass rollout of fully autonomous vehicles, and what are the main challenges to overcome?

Mark Patrick: The rollout of fully autonomous vehicles is already happening to some extent, particularly at Level 2, where we see vehicles with various assistance features. To achieve higher levels of autonomy, there are technical, ethical, and social challenges to address. Technically, the necessary processing power and machine learning algorithms are increasingly available. Social acceptance of driverless cars and ethical considerations, such as decision-making in complex situations, remain areas of concern. While trials of fully autonomous vehicles are ongoing, predicting when they will become mainstream is challenging. However, we may see more of these services in cities
 across the world in the next five years.

Before fully autonomous vehicles become commonplace on the road, do you expect them to be used in controlled environments, like ports, airports, or factories?

Mark Patrick: Yes, we are already witnessing the use of autonomous technology in controlled environments, such as autonomous ground vehicles and robots. In these settings, the technology is more readily accepted and deployed. The same principles can be applied to larger-scale deployments in defined geographical areas with established infrastructure. Controlled environments with specific infrastructure and limited interaction with the general public are more suitable for early adoption. Industrial facilities and warehouses are already leveraging autonomous technology for efficiency.

Is there anything else about Mouser or your role that you would like to mention for the report?

Mark Patrick: Our primary role at Mouser is to enable access to technology. We work closely with our suppliers to ensure that those working on advanced systems have access to the technology they need, whether it is high-end processing power, sensing technologies, or a wide range of components. We offer a comprehensive range of products that can be used to build end-to-end systems. Essentially, we aim to provide everything customers need to develop their projects, from individual components to complete kits, making the process of accessing technology as straightforward as possible.

LiDAR
LiDAR (Light Detection And Ranging) sensors help autonomous vehicles to sense and understand their surroundings. They use laser pulses to detect objects and measure the time it takes for the reflected light to return, compiling this data to create 3D mappings of their environment. This information is then combined with other data to ensure safe navigation.
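The time-of-flight ranging principle described above can be sketched numerically. The following minimal Python snippet is illustrative only and is not drawn from any particular sensor's documentation:

```python
# Pulsed-LiDAR ranging sketch: range is half the round-trip time of a
# laser pulse multiplied by the speed of light. Purely illustrative.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_range_m(round_trip_s: float) -> float:
    """Distance to a reflector from a pulse's round-trip time (seconds)."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A return arriving ~667 nanoseconds after emission corresponds to a
# target roughly 100 m away.
print(f"{tof_range_m(667e-9):.1f} m")
```

Sweeping such measurements across many beam directions is what builds up the 3D point-cloud map of the surroundings.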
A core area of current LiDAR research is developing systems that combine the strengths of different LiDAR technologies to improve overall perception performance. Pairing pulsed LiDAR with FMCW LiDAR, for instance, provides comprehensive object detection, accurate distance measurement, and real-time velocity estimation. A hybrid LiDAR setup could integrate a solid-state laser for short-distance assessments alongside an FMCW laser optimized for capturing distant measurements. Integrating LiDAR with other sensors like cameras and RADAR creates a sensor fusion ecosystem that can address sensor redundancies and data gaps, ultimately improving the robustness and reliability of autonomous driving systems.11

LiDAR Product Overview
Solid-state LiDAR
Solid-state LiDAR systems use non-moving optical components to steer laser beams, making them well-suited for the stringent requirements of AVs.12 11 Launched in 2018, solid-state LiDAR can extend sensor range beyond 200 meters while reducing costs by more than ten times. They offer a promising advantage over conventional LiDAR, which steers an optical beam using moving parts; the assembly and alignment of these moving parts are expensive and raise significant concerns about their long-term dependability. Demand for solid-state LiDAR is expected to grow at a CAGR of 30.66% over the 2021-26 forecast period. This potential growth is reflected in the high volume of research in this area, including the emerging field of nanophotonics-based LiDAR sensors. LiDAR makers like Velodyne (now Velodyne+Ouster) and tech companies like Luminar and Xenomatix are advancing solid-state LiDAR research, with OEMs like Mercedes-Benz entering deeper partnerships in the solid-state LiDAR space.

Frequency-Modulated Continuous Wave (FMCW) LiDAR
Frequency-Modulated Continuous Wave (FMCW) LiDAR works by emitting a continuous laser signal with a modulated frequency, which enables simultaneous distance and velocity measurements.11 This real-time capability is crucial for AVs to accurately assess dynamic environments. FMCW LiDAR's continuous waveform provides higher resolution, enabling fine-grained object detection and tracking. Although signal processing complexities exist, research in this field is advancing rapidly, promising improved perception for AVs. It has been recognized as a transformative advancement in LiDAR technology. Pioneering companies like Aeva, Mobileye, and Blickfeld have spent years developing Photonic Integrated Circuits (PICs) and FMCW sensors, poised to revolutionize the landscape of autonomous driving.13 14
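The simultaneous range and velocity measurement can be illustrated with a small numerical sketch of a triangular-chirp FMCW receiver. All parameter values below (wavelength, chirp bandwidth, and duration) are assumptions chosen for illustration, not specifications of any product mentioned here:

```python
# Triangular-chirp FMCW sketch: the up-chirp and down-chirp beat tones
# contain the same range term but opposite-sign Doppler terms, so one
# measurement yields both distance and radial speed. Illustrative only.
C = 299_792_458.0           # speed of light, m/s
WAVELENGTH = 1550e-9        # 1550 nm laser (assumed)
CHIRP_BW = 1e9              # 1 GHz optical frequency sweep (assumed)
CHIRP_T = 10e-6             # 10 microsecond sweep duration (assumed)
SLOPE = CHIRP_BW / CHIRP_T  # chirp slope, Hz per second

def range_and_velocity(f_up: float, f_down: float):
    """Range (m) and closing speed (m/s) from up/down beat frequencies."""
    f_range = (f_up + f_down) / 2    # symmetric part -> distance
    f_doppler = (f_down - f_up) / 2  # antisymmetric part -> speed
    rng = C * f_range / (2 * SLOPE)
    vel = f_doppler * WAVELENGTH / 2  # positive = approaching target
    return rng, vel
```

With a target 75 m away closing at 20 m/s, the two beat tones separate cleanly into a range term and a Doppler term, which is why a single FMCW measurement yields both quantities at once.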
Companies Developing LiDAR Technologies for AVs
In this section, we go deeper into the companies at the forefront of advancing LiDAR technology for AVs.

Velodyne
Velodyne is a prominent provider of LiDAR sensors developed for AVs and the first LiDAR company to go public. The company asserts itself in the automotive industry by working closely with customers to test its LiDAR sensors against common sets of real-world scenarios and relevant corner cases. In February 2023, Velodyne merged with Ouster. Major players in the AV industry, such as Waymo, Uber, and Cruise, utilize Velodyne's LiDAR sensors.15

Luminar Technologies
Luminar Technologies develops vision-based LiDAR and machine perception technologies, primarily for autonomous vehicles. In February 2023, Luminar launched Iris Plus, a LiDAR sensor designed to blend into the roofline of a production vehicle. It uses laser light at a longer wavelength than usual, 1550 nanometers instead of the common 905. This improves the device's ability to detect small and
 low-reflective objects, including dark-colored cars, animals, or a child suddenly running into the street. It operates at distances exceeding 250 meters, and up to 500 meters for larger, more reflective objects. Mercedes plans to be among the first car manufacturers to incorporate Luminar's Iris Plus LiDAR into its production vehicles. Mercedes and Luminar announced their partnership in January 2022, initially aiming to integrate Luminar's LiDAR into a single high-end vehicle model. Since then, plans have expanded significantly, with Mercedes aiming to increase its LiDAR supply by ten times over the coming years. Big-name companies like Volvo, Toyota, and BMW also employ Luminar's sensors.16,17

Aeva Technologies
Aeva Technologies pioneers LiDAR sensors with capabilities in both visible and infrared spectrums. Uber and Continental are among the companies adopting Aeva's technology. In 2022, Aeva released its 4D LiDAR sensor, Aeries II, which employs FMCW 4D technology and a LiDAR-on-chip silicon photonics design. Aeries II is compact, configurable, and automotive-grade, designed for reliability across various conditions. With ultra-long-range object detection and tracking capabilities of up to 500 meters, it stands out in detecting oncoming vehicles, pedestrians, and animals. Additionally, Aeva's FMCW technology remains unaffected by interference from sunlight or other LiDAR sensors, and its LiDAR-on-chip design enables scalable production for a wide range of autonomous applications.18

Quanergy Systems
Since 2022, Quanergy has focused on transforming physical security, which plays a crucial role in enhancing situational awareness and safety in driving, with its real-time 3D LiDAR solutions. The company is a pioneer in providing 3D LiDAR security solutions that bring intelligent and proactive
awareness to dynamic environments. Quanergy aims to empower users to transcend current sensing limitations, offering an experience of 3D security tailored for a 3D world. Toyota and Geely are among the companies incorporating Quanergy's sensors into their AV products.19 20

LiDAR light pulses covering an object on the road. Image credit: Delphi

Intel and Mobileye
Since 2020, Intel and Mobileye have focused specifically on enhancing the performance of LiDAR and RADAR sensors for AVs by leveraging technologies such as PICs and FMCW LiDARs. They are working on hybrid LiDAR-RADAR solutions that aim to capitalize on the strengths of both technologies. The proposed architecture involves the integration of cameras, LiDARs, and RADARs to cover the full field of view, aiming to overcome challenges like side lobes and limited range in traditional sensors.21 22 The collaboration between the two companies aims to
 make RADARs and LiDARs both better and cheaper in order to reach L5 autonomy more quickly. Their new product range is expected to launch in 2025.

Continental
Continental's High-Resolution 3D Flash LiDAR technology marks a significant advancement in vehicle vision. Released in 2021, this LiDAR system boasts a solid-state design, ensuring continuous data flow without gaps. Its high-resolution capabilities span both vertical and horizontal dimensions, offering detailed insights. The system also includes features like blockage detection, an integrated heater, an optional washing system, auto-alignment, and a continuous sampling mode.23 24

Blickfeld
Blickfeld introduced the Qb2 smart LiDAR sensor in 2022, a device designed for easy deployment thanks to its onboard processing and built-in Wi-Fi connectivity. This marks the first smart LiDAR sensor featuring built-in software. The Qb2 merges high-performance detection and ranging capabilities with onboard software, improving performance and setup efficiency without requiring any complex custom software development. The Qb2 employs a custom micro-electro-mechanical systems (MEMS) mirror for beam steering, optimizing the balance between resolution, range, and field of view to create multi-dimensional maps. It achieves a maximum of 400 scan lines per frame, ensuring high-quality point cloud data. The sensor is designed to accommodate three returns and has a laser beam divergence of 0.25 x 0.25, facilitating meticulous scanning for precise and dependable information.25 14 26

Hesai Technology
Hesai Technology offers a variety of LiDAR sensors designed to meet the requirements of Level 4 and higher autonomous driving, ensuring reliable and safe operation. On August 1, 2023, Hesai Technology announced its partnership with NVIDIA. This collaboration aims to integrate Hesai's advanced LiDAR sensors into the NVIDIA DRIVE and NVIDIA Omniverse platforms, setting the stage for groundbreaking developments in autonomous driving. By bringing together Hesai's specialized LiDAR technology and NVIDIA's expertise in AI, simulation, and software development, this partnership promises to drive innovation in the AV sector.

RoboSense
RoboSense offers various smart LiDAR perception system solutions based on three fundamental technologies: chips, LiDAR hardware, and perception software. In 2016,
RoboSense began working on mechanical LiDAR, known as the R platform. By 2017, they had introduced perception software and the M platform. In 2021, RoboSense started production of the M1, becoming the first LiDAR company globally to mass-produce automotive-grade LiDAR with internally developed chips. In 2022, to broaden the M platform product range in the automotive LiDAR field, RoboSense introduced the E platform, a blind-spot solid-state LiDAR. OEMs that implement RoboSense solutions include BYD, GAC MOTOR, SAIC Motor, Geely, FAW, Toyota, BAIC Group, and many others.

Continental's High-Resolution 3D Flash LiDAR. Image credit: Continental

SPONSOR INTERVIEW
Theresa Hackl, Application Marketing Engineer at Murata
Komei Takura, Senior Business Development Manager for Mobility
Yoichi Murakami, Senior Product Manager for Function Devices

Building Trust in Autonomous Driving: Navigating Future Reliability and Milestone Achievements

In your role at Murata and from the viewpoint of a component supplier, can you provide an overview of the current state of the autonomous vehicle industry?

Theresa Hackl: There's a lot going on in the autonomous vehicle industry these days; lots of development and testing. Many
 manufacturers (OEMs) and many tech companies are involved. Recently, there has also been a lot of news about OEMs collaborating with IC companies and even Tier 1s. Focusing on what's happening on the road today, you can see Level 2 (partial automation) or Level 2+ (L2 with enhanced ADAS) cars, in addition to the "standard" cars with no automation (Level 0) or just simple ADAS functions (Level 1). There are also instances where Level 3 (conditional automation) has been granted. For example, the Mercedes S-Class was granted L3 autonomous driving in Germany at the end of 2021 (with market release in spring 2022) and, since early this year, also in some regions of the US, under certain conditions. This still faces some challenges, but maybe we can discuss them later. As for Level 4 (high automation) or more highly autonomous driving, these would be found in robotaxis, operating now, for example, in San Francisco, California. These driverless cars have been allowed to operate 24/7. However, just recently, it was announced that there will be a limit on the number of driverless cars allowed to operate in San Francisco due to an accident where an autonomous vehicle collided with a fire truck. As a result, they reduced the number of autonomous vehicles allowed to operate to fifty per day (and 150 during nighttime). Basically, you can see there are still some limitations to using and operating autonomous vehicles.

Komei Takura: I'd like to add something here. In terms of mindset, you can see quite a difference between Europe, the US, and China. I joined a conference in the US a month ago. The feeling was that Level 4 or Level 5 cars would be on the road in about two years. The acceptance and belief in this is quite amazing compared to the mindset here in Europe. In China, for sure, they want to go even further. I mean, they want to take initiatives to be world leaders technology-wise. The willingness to develop faster is quite different from what you see in Europe, which is quite interesting.

How do you see Murata's current product portfolio integrating with the evolving needs of autonomous vehicles?

Theresa Hackl: Well, as a component manufacturer, Murata can be found in various sensors like cameras, LiDAR, RADAR, etc., and also in ECUs, the brains of the vehicle. For example, for a Level 3 car equipped with all the sensors and technologies needed for safe driving, Murata could provide up to 8,000 components, including passive components like capacitors, inductors, and thermistors. These cover just the ADAS functions. As for electrification, an electrified car or a connected car would, of course, have many more components on top, like our Bluetooth and Wi-Fi modules.

Komei Takura: The number of components has really been increasing quite a lot. At first, there used to be about 1,000 components or so; the combustion engine didn't have an ADAS system. Our main business for automotive is IVI (in-vehicle infotainment), but the number of components has been increasing rapidly from about 1,000 to an expected 15,000 or even 20,000 components per car for EVs with ADAS functions in the next 3-4 years. That's a significant change.

What are the main challenges that you feel still exist in achieving fully autonomous vehicles, and how is Murata working to address those challenges from your perspective?

Theresa Hackl: There are many challenges. One is, of course, regulations. Each region and country has its own regulations. In the US, it's even more fragmented than here in Europe, where, I believe, it seems quite good, as we have particular regulations that allow autonomous driving under certain conditions. But there are also technical challenges to consider. You need to have redundancy and fail-safe operation of sensors as they work together. There are infrastructure needs, where you would need to ensure the communication between the sensors and also between vehicles. With that, safety concerns arise. Generally, you need to ensure that the sensors are working well under all conditions, no matter what the weather is or what may happen during operation. This also leads to other issues to consider, such as ethical concerns, legal issues, and consumer acceptance. Basically, there are a lot of challenges to be addressed, not to mention that people also need to feel safe while using an autonomous vehicle. Of course, each manufacturer has to address such challenges carefully. As for Murata's involvement, we are mainly component-oriented, but we also communicate and collaborate with a lot of partners and industry players. We stay aware of what they need, and we also contribute to addressing the safety concerns. In the end, it's more on the side of the vehicle manufacturers and tech companies to ensure and spread the acceptance of safe autonomous driving. Furthermore, Murata also contributes to the trend in the automotive industry towards size and weight reduction by downsizing the sensors or ECUs. In fact, about 90% of autonomous miles in California are already supported by Murata's inertial measurement units (IMUs).

Komei Takura: There are many tests for autonomous driving over there in California, like what Waymo and some others are doing. They of course need high accuracy and high performance. Murata's IMU is a dead-reckoning sensor, and customers need such a high-accuracy solution, especially to ensure safety in this market. When the market develops to a later stage, the entire system may mature, and thus the number of sensors could be reduced a bit, but accuracy remains the key anyway. That's why we believe that companies will keep selecting our solutions.

How important is the role of partnerships and collaborations in the evolution of autonomous vehicles? Can you share some insights into the kind of partnerships Murata is pursuing?

Theresa Hackl: As I mentioned earlier, we read in the news all the time that Qualcomm is collaborating with this OEM or with that OEM. Tier 1s are working with this OEM and providing this and that. Partnerships and collaborations play a crucial role here in evolving autonomous vehicles and combining the strengths of each party. Murata is, of course, in contact with all of these stakeholders to have a better overview of the ecosystem and what's going on in the market, and to also be able to provide the best solutions. One collaboration, for example, is with system integrators like Nordic Inertial.

Yoichi Murakami: With regards to Nordic Inertial, we invested in this company because they focus on the algorithm along with how to use our sensor inside a vehicle. For future autonomous driving, algorithm understanding is one of the key aspects, and this is more easily achieved using our high-performance sensor. This is just the start. In the future, we'd like to do such collaborations with other players in the market in order to establish the value of the Murata sensor in the market.

Komei Takura: Speaking about the general autonomous driving market, collaborations aimed at software development are really important for our customers, especially OEMs. This is quite common, but for Murata specifically, we collaborate with software companies and system integrators to be an integral part of the ecosystem of autonomous vehicle customers.

How is Murata staying ahead of the curve in predicting and adapting to changing requirements, and what R&D initiatives are currently in place to ensure the company remains a leading player in the mobility sector?

Theresa Hackl: Our most important philosophy is to provide high-quality products. We believe that's one of the main reasons Murata is chosen and what we are well-known for. Of course, we will continue developing these cutting-edge products with high quality while also adapting to market needs and trends. In the future, we would also like to go more into the solution business by working together with OEMs, IC design houses, and system integrators. Through our module and mobile phone business, we already have a good relationship with IC makers, so we can build on that.

Can you share some insights about Murata's roadmap for the next 5 years in the autonomous vehicle sector?

Theresa Hackl: Of course, we will continue to go along with the downsizing trend, but we also want to focus more on the application itself and not just on a single product. We aim to become a solution provider, specifically in terms of the sensor portfolio, and then be able to provide our customers with a concrete solution instead of just a single capacitor, inductor, etc.

Komei Takura: Yes, speaking of sensors, we were an element supplier for ultrasonic sensors. We can be a module supplier. With the algorithm, we can also become part of a system supplier. In the coming years, we want to be more integrated with the module and the algorithm, as a solution provider. That's one of the sensor trends for the upcoming years.

As the automotive industry pushes for the democratization of autonomous features across all vehicle tiers, how is Murata working to make sensors more cost-effective for mass-market adoption?

Yoichi Murakami: Let me go back to basics a little. Murata is a unique company; product manufacturing is key. At Murata, we distinctly focus on manufacturing processes on our premises. Why? Simply, to make a high-quality product, we need to fully understand the material and how to manufacture the product, and the best way to do so is to have the manufacturing site on company premises in order to have complete clarity of the manufacturing process. That is a key philosophy of Murata's. Murata sticks to manufacturing; that is why it's called Murata Manufacturing. While many companies create their designs and miniaturized products without fully understanding the key parameters of manufacturing, Murata keeps manufacturing on its own premises. This is one of our major advantages. Murata enables miniaturization with high-quality manufacturing of high-performance components. Even though the market requires cost reduction, quality remains the number-one priority, and Murata proudly provides that.

One of the key challenges for sensors in autonomous vehicles is operation in adverse weather conditions like fog, rain, or snow. How is Murata addressing these challenges to ensure consistent and reliable sensor performance?

Theresa Hackl: To address the challenge for sensors to perform in all weather conditions, we are currently developing an ultrasonic cleaning device that keeps devices clean and reliable. The main focus for this product is the camera application, but it could also be adapted in the future to other systems, like LiDAR, for example.

With the vast amounts of data generated by autonomous vehicle sensors, how is Murata approaching on-chip or near-sensor processing?

Komei Takura: Well, while we're not a system supplier, maybe I can answer you from our passive-component-supplier viewpoint. One way we are contributing to data processing quality and speed is with our noise filters, which help reduce noise and smoothen the data processing. We've been working with standards bodies to apply these products to areas like Controller Area Networks (CANs) and Ethernet. Based on the corresponding requirements, we collaborate with OEM customers and make sure that even the small components they need to qualify can be used properly, as these will contribute to better gateway performance. That's one example. Of course, talking about ADAS ECUs and high-speed processing, as Theresa already explained, we've been working with known semiconductor companies. These companies need thousands of components for their chipsets. We support their design activity as a passive component supplier.

What can people expect in terms of the sustainability and longevity aspects of your products? What is your approach to sustainability in your product portfolio?

Yoichi Murakami: Of course, there are basic things. At Murata, we comply with quality requirements and standards, like ISO and IATF. We also conduct very-severe-condition testing inside Murata because reliability and quality are key in a Murata product. One key idea to keep in mind, especially in new areas like autonomous driving, is that no one knows exactly what kind of reliability requirements there are going to be in the future. For that reason, we need to think ahead a little more and base that on the current situation. The most important point is to establish trust with vehicle vendors and customers who will be using autonomous vehicles in the future. As Theresa mentioned earlier, 90% of autonomous driving mileage in the US was realized with the Murata IMU sensor, a good indication of our product reliability.

Komei Takura: Just to iterate on the trust and quality aspect. As many people know, from time to time there are recalls and problems with car models. In order to ensure our customers' parts selection, we are supporting OEMs and Tier 1s by reviewing their BOMs and recommending the right components from both the quality viewpoint and the long-lifetime viewpoint. Otherwise, they may not pick the right parts, which can be an issue for the automotive sector, where it's not easy to switch components in the middle of a mass production period. Such support also contributes to quality at the system level, which in turn helps ensure consumer trust at the market level.

Visit Murata's website for more information about their product portfolio and their leading contribution as a component supplier to the evolution of AV technologies.

Note: Since this interview, Cruise has had its licence revoked following an accident.

"Autonomous vehicles navigate the road of innovation, driven by the promise of a safer, more efficient, and greener interconnected future." - Theresa Hackl

Structure and physics of a RADAR (block diagram: RF generator, -3 dB power divider, transmitting antenna, receiving antenna, pre-amplifier, mixer stage, low-pass filter, base-band amplifier, analog-to-digital converter to the interface of a computer). Image credit: BabakShah/Wevolver

RADAR
In advanced driver-assistance systems, a combination of radar types is utilized for optimal performance. Long-range radar (LRR) excels in detecting objects up to 250 meters away. Medium-range radar (MRR) functions effectively within a 1-60 meter radius, while short-range radar (SRR) operates best from 1-30 meters, aiding in tasks like blind-spot detection and parking assistance. Radar sensors are typically positioned on each side of a vehicle, encompassing the front, back, and sides. RADAR in autonomous vehicles operates at frequencies of 24, 74, 77, and 79 GHz. Two primary radar types are prevalent in these systems: impulse RADAR and frequency-modulated continuous wave (FMCW) RADAR. In impulse RADAR, one pulse is emitted from the device and the frequency of the signal remains constant throughout the operation. In FMCW RADAR, pulses are emitted continually. Research and development in the last three years has pushed to solve many of the challenges in how autonomous vehicles navigate, interact, and adapt to ever-changing environments.
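As a numerical aside on the FMCW principle above: the range resolution of an FMCW radar is set by its swept bandwidth, delta_R = c / (2B). The bandwidth figures below are illustrative assumptions, not vendor specifications:

```python
# FMCW radar range resolution sketch: two targets can be separated in
# range only if they are farther apart than c / (2 * bandwidth).
# Bandwidth values are assumed for illustration.
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest resolvable range difference for a swept bandwidth."""
    return C / (2 * bandwidth_hz)

# A wide 2 GHz sweep (e.g. within the 77-81 GHz band) resolves targets
# about 7.5 cm apart; a narrow 200 MHz sweep, only about 75 cm.
for b in (2e9, 200e6):
    print(f"B = {b / 1e9:.1f} GHz -> {range_resolution_m(b) * 100:.1f} cm")
```

This is one reason the wide-bandwidth 77/79 GHz bands listed above are favored over 24 GHz for high-resolution automotive sensing.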
Highlights of this research are outlined below.

Solid-State RADAR
Solid-State RADAR sensors employ electronically controlled components to eliminate the need for moving parts. This advancement contributes to the higher reliability, durability, and longevity of RADAR sensors, making them suitable for the demanding operational conditions of AVs. Solid-State RADARs are also more compact, enabling easier integration into AV designs. Furthermore, their lower power consumption and reduced heat generation are crucial for maintaining energy efficiency in AVs.27 This technology is being actively researched and implemented by
companies such as Continental, Bosch, and Veoneer for applications in AVs. The shift to Solid-State RADAR signifies a move towards more robust and affordable sensing solutions in the evolving landscape of autonomous driving.

4D RADAR
4D RADAR sensors build upon FMCW technology, incorporating time as the fourth dimension. This temporal information enhances the AV's ability to predict the trajectory of moving objects, providing a more comprehensive understanding of the surrounding environment.28 AV companies like Waymo, Aurora, and Argo AI are exploring 4D RADAR sensors to enhance perception in autonomous vehicles. It is worth highlighting that the importance of these sensors can vary based on the overall sensor fusion strategy employed by developers.

Synthetic Aperture RADAR (SAR)
Synthetic Aperture RADAR (SAR) represents an advanced RADAR technique that offers high-resolution imaging capabilities for RADAR sensors. It enables AVs to better perceive and analyze objects, obstacles, and terrain, even in challenging weather conditions or low-visibility scenarios. SAR generates detailed images by synthesizing multiple RADAR measurements taken from different positions as the vehicle moves. This approach creates a large virtual antenna, resulting in finer resolution and improved object recognition. SAR is particularly valuable for identifying small objects, distinguishing between pedestrians and stationary obstacles, and enhancing AVs' perception in complex scenarios. Using sensor movement, it achieves precise angular resolution by creating a substantial antenna aperture. Given the sensor locations, consecutive RADAR measurements may be processed as if a single large antenna array had acquired them. The figure below illustrates this principle.29

Recent research by Cambridge, Volkswagen, and the German Institute of Microwaves and Photonics has confirmed that SAR imaging can be successfully and routinely used for high-resolution mapping of urban environments in the near future.

Imaging radars
Imaging radar represents a specific RADAR variant capable of constructing 2D or 3D depictions of the neighboring surroundings. Between 2020 and 2023, significant advancements have been made in imaging radar technology, resulting in increased efficiency, improved capabilities, and expanded applications.
210、dars have expanded their capabilities by incor-porating multi-mode functionality,including weather-penetrating RADAR Car position at t0 0Car position at t1 1Synthetic apertureIllustration of a synthetic aperture created from consecutive measurements of a moving RADAR.Image credit:3233Current RadarsS
211、hort-range radarMid-range radarLong-range radar200m4D Imaging Radar300mmodes.These modes enable the RADAR to operate effectively even in challenging weather conditions such as heavy rain,snow,or fog.Furthermore,imaging radars are increasingly integrated with comple-mentary sensors like LiDAR,cameras
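The aperture-synthesis principle shown in the synthetic aperture figure above can be sketched numerically with a toy delay-and-sum simulation: consecutive snapshots from a moving radar are phase-aligned and summed as if they came from one large array, and the target bearing is recovered from the peak response. Every number below (the 77 GHz wavelength, the snapshot spacing, the 12° target) is an assumed toy setup, not a value taken from this report.

```python
import numpy as np

# Toy synthetic-aperture sketch (not a production SAR pipeline).
wavelength = 0.0039                      # ~77 GHz automotive radar, metres
positions = np.arange(64) * wavelength / 4   # 64 snapshots, quarter-wave spacing
true_angle = np.deg2rad(12.0)            # assumed far-field target bearing

# Round-trip phase observed at each sensor position for a point target
phase = 2 * np.pi * 2 * positions * np.sin(true_angle) / wavelength
signal = np.exp(1j * phase)

# Delay-and-sum over candidate angles: the whole track acts as one array
angles = np.deg2rad(np.linspace(-30, 30, 2001))
steer = np.exp(-1j * 2 * np.pi * 2 * np.outer(positions, np.sin(angles)) / wavelength)
response = np.abs(signal @ steer)

est_deg = np.rad2deg(angles[response.argmax()])
print(round(est_deg, 1))  # → 12.0
```

The 64 quarter-wavelength steps form an aperture of only a few centimetres, yet the beamformer already localizes the target to a fraction of a degree; a longer drive path sharpens the resolution further, which is the effect the text describes.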
, and ultrasonic sensors to enhance perception accuracy. This sensor fusion approach facilitates a comprehensive understanding of the surrounding environment and offers redundancy during sensor failures. Finally, imaging radars have benefited from advancements in signal processing algorithms, which now enable them to filter out noise, distinguish between various object types, and predict the behavior of detected entities. These advancements contribute significantly to improved decision-making by the autonomous vehicle's control system, enhancing overall safety and performance.

4D RADAR

While traditional imaging radar systems construct 2D or 3D depictions of the surroundings, 4D imaging radars use the time-of-flight principle to create a 3D representation of the surroundings, with time as the fourth dimension. This technique also provides information about the speed of approaching or retreating vehicles. These RADARs have successfully addressed the primary resolution challenge that conventional RADARs face: their resolution is significantly lower than that of cameras and LiDARs. 4D imaging radars excel at detecting objects both vertically and horizontally, enabling high-resolution
object classification. This advancement enhances the RADAR system's ability to determine the vehicle's location independently. 4D imaging radars are not yet in widespread use across all OEMs, but adoption is a promising trend. The uptake of radar technologies varies among automotive manufacturers, which we touch on later in the Tech Stack chapter.

Imaging radar can differentiate between cars, pedestrians, and other objects. Image credit: NXP

Millimeter Wave RADARs

Research groups in both the US and Japan indicate that millimeter wave RADAR has significant potential for AVs beyond its current use in parking assist. Millimeter-wave radar offers a cost-effective alternative to LiDAR, cameras, and optical sensors, primarily because its composition is limited to an integrated circuit (IC) and printed antennas, reducing its overall expense. Additionally, this type of radar demonstrates superior performance in challenging weather conditions like fog and rain, where traditional camera systems might falter. It also excels at detecting non-line-of-sight targets, such as those on curved road sections, making it a more reliable option in complex driving scenarios. [30][31] Continental, ZF, Bosch, Hella, Aptiv, Denso, Nidec Elesys, Valeo, Veoneer, and Hitachi are all developing millimeter wave RADARs for use in high-level autonomy vehicles.

Companies Developing RADAR Technologies for AVs

Below we outline companies leading the charge in the development of cutting-edge RADAR technologies tailored specifically
for autonomous vehicles.

Comparison between current front imaging radars (coverage from 18° to 80°) and 4D imaging radars (100° coverage). Image credit: Future Bridge

NVIDIA NVRadarNet

NVIDIA NVRadarNet enhances traditional RADAR processing methods for object detection by incorporating a DNN approach. While classical RADAR processing can identify moving vehicles effectively, it struggles with stationary objects, often misclassifying them. The solution involved training a DNN using data from RADAR sensors to detect both moving and stationary objects and to differentiate between various stationary obstacles. To address sparse RADAR data, ground truth labels were transferred from corresponding LiDAR datasets, allowing the DNN to learn not only object detection but also objects' 3D shapes, dimensions, and orientations. The integration of the RADAR DNN with classical RADAR processing improved obstacle perception, aiding AVs in making better driving decisions, even in complex scenarios, and offering redundancy to camera-based obstacle detection. [32]

"The DNNs, the deep neural networks, are becoming more and more complex. We have the ability to not just detect a pedestrian, but to detect a distracted pedestrian."
Danny Shapiro, VP of Automotive at NVIDIA

Example of propagating bounding box labels for cars from the LiDAR data domain into the RADAR data domain. Image credit: NVIDIA

Illustration of ZF's 4D imaging radar employed on SAIC's R-series vehicle. Image credit: ZF

Navtech

Navtech RADAR offers a robust sensor solution for AVs, ensuring performance in adverse conditions where other sensors might falter. The high-resolution, 360°, long-range RADAR excels in adverse weather and environmental challenges, providing an extensive and accurate view of its surroundings. In 2021, this technology was chosen by Örebro University as a key sensor for groundbreaking AV research, with a special focus on operating faultlessly in the harshest conditions: dust, dirt, and low environmental visibility. This RADAR's application extends to test routes and behavior analysis of both autonomous and regular vehicles, further solidifying its role in advancing autonomous technology. [33]

NXP

In January 2023, NXP released a new industry-first 28 nm RFCMOS radar one-chip IC family for next-generation autonomous driving systems, enabling the long-range detection of objects and the separation of small objects next to larger ones. This technology offers faster signal processing and allows for the implementation of 4D imaging radar capabilities in vehicles, particularly for levels of automation like L2+ and higher. These developments provide a cost-effective solution for original equipment manufacturers to integrate advanced RADAR
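Before DNN-based methods, one classical way to separate movers from static returns, which the text's observation about misclassified stationary objects alludes to, is ego-motion compensation of Doppler measurements: a stationary object should appear to close at exactly the ego speed projected onto its bearing. The helper below is a hypothetical sketch of that rule, not NVIDIA's implementation; the sign convention (negative Doppler means closing) and tolerance are assumptions.

```python
import math

# Hypothetical ego-motion Doppler check, not NVIDIA's pipeline.
def is_stationary(doppler_mps, bearing_deg, ego_speed_mps, tol=0.5):
    # A stationary object appears to close at ego_speed * cos(bearing),
    # so its measured Doppler (negative = closing) should match that value.
    expected = -ego_speed_mps * math.cos(math.radians(bearing_deg))
    return abs(doppler_mps - expected) < tol

# Ego car at 20 m/s: a parked car dead ahead closes at exactly -20 m/s,
# while an oncoming car closes faster and is flagged as moving.
print(is_stationary(-20.0, 0.0, 20.0))   # → True  (parked car)
print(is_stationary(-35.0, 0.0, 20.0))   # → False (oncoming vehicle)
```

The weakness of this rule is exactly what the section describes: everything that passes the check is lumped together as "stationary", with no way to tell a parked car from a guard rail, which is the gap the DNN approach fills.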
systems into their vehicles. In addition to the RADAR processor and transceivers, NXP also offers essential peripherals, including safe power management and in-vehicle network components, to create a complete RADAR node system. [34]

Vayyar

In 2021, Vayyar developed a production-ready RADAR-on-Chip (RoC) platform. The platform offers a single multifunctional chip capable of replacing multiple traditional one-function sensors, reducing complexity for in-cabin and AV applications. The RoC features up to 48 transceivers, an internal DSP, and an MCU for real-time signal processing, providing all-weather effectiveness and the ability to see through objects. This single-chip solution can replace over a dozen sensors, eliminating the need for expensive LiDAR and cameras. Vayyar's RoC offers a wide range of applications, from intruder alerts to enhanced seat belt reminders, catering to the increasing sensor density in modern vehicles while delivering uncompromising safety. [35][36]

SPONSOR INTERVIEW: Misha Govshteyn, CEO of MacroFab, and Brenden Duncombe, Director of Customer Engineering

The Role of PCBs in Shaping Autonomous Vehicle Development

Can you both explain your roles and what MacroFab is?

Misha Govshteyn: Yeah, of course. My name is Misha and I'm the CEO of MacroFab. I've been here for about five years.

Brenden Duncombe: My name is Brenden Duncombe. I'm the Director of Customer Engineering here at MacroFab and I've been here about six months.

Misha Govshteyn: MacroFab is a digital platform for electronics manufacturing, and we're powered by the world's only factory marketplace. In most cases, companies contract with individual manufacturers. With MacroFab, it is very different. We are a platform that gives customers access to hundreds of production lines in multiple countries. So you can literally upload your design to MacroFab and we will match you with the right factory. The best part is that MacroFab is responsible for every aspect of production. You're working with us, and we deliver the product to you. That spans everything from prototype to production, so you don't have to switch factories. You don't have to move from one supplier to another. We have customers moving from the earliest stages of prototyping to multi-million dollar orders, all on the same platform, working with MacroFab exclusively. So somebody like Brenden would be leading the charge with them. They may be working in different factories, multiple factories, and in parallel, but they are always working with the same team.

Thanks for the introduction. What is PCB prototyping?

Brenden Duncombe: Yeah, I can start here. In PCB prototyping, electrical engineers or hardware designers often begin with dev kits on their desks or start with an idea. As they move through the prototyping into the production process, at a certain point, they'll need to get their design actually onto a fully integrated PCB for testing or design validation. And there will usually be many stages of that. Frequently, as you go through the process, you will learn things from the early ones. You may do one just for electrical design, then you will do another prototype where you're confirming that it fits in your mechanical enclosure, or you may have to produce some prototypes for RF testing. So, for each one of those stages, you will need to get a very low volume of PCBs made to do integration, testing, and validation.

Misha Govshteyn: The design process for all of the world's products is now computerized. Some CAD products help you design mechanical parts, even for woodworking, right? There are digital products. So you're sitting in your computing environment. You can do 3D renderings of things; the same thing goes for electronics. So, a lot of the design process happens in people's heads. It occurs in computing environments where you can do simulations. But at some point, the simulation is not enough. So you've got to take that virtual design, where you can see what your circuit board looks like and make assumptions about how it works, and you have to produce a functional prototype, so you can plug it into other auxiliary devices connected to other parts. Physical products require physical prototypes. So, usually, the design process is iterative. You design something, build a physical prototype, and see how it works, but it usually works differently than you expect. So you have to build multiple iterations of it, and really, the faster you can go from the digital version to the physical version and iterate quickly, the more you're compressing time for design iterations. Engineering time
is costly, and this goes for every stage of production. We're talking about prototyping right now. Switching factories and waiting for things to happen in factories is the most expensive thing in the world. You change the design, and now you have to wait a long time for the factory to reflect that design; that is an actual cost, and that's part of what MacroFab is compressing, because everything happens on the same platform. It doesn't matter which factory you need. We have hundreds of production lines to prototype on and eventually move to production.

What's unique about MacroFab's approach?

Misha Govshteyn: We're the first and only platform connecting customers to hundreds of production lines. Usually, all of this is people work. What happens when an engineer needs a factory? Either an engineer gets on a plane, or their supply chain gets on a plane, and starts traveling halfway around the world to find out which factories are good and which are bad. You can't tell when you walk into a factory. You can even hear people say things like "this factory had the right smell." That's usually a sign that they have yet to determine whether it is a good factory. Factories are data. Factories are output. We're the only way to
aggregate many factories in one place and understand what they are good at building, what they are bad at building, and what kind of equipment they have. Can they even notionally build the right product, given the design parameters? Some factories have old equipment, and some factories have modern equipment. Humans aren't fast enough to understand all of this, but our software is much faster and does it algorithmically, so how we match up customers and factories is much faster than everybody else. And one big realization we made is that none of that works without humans at the end of the day. So we have humans in the loop, and guys like Brenden actually do travel to factories. But Brenden knows precisely what he's looking for when he walks into one. So I think we take a lot of the heavy lifting from customers, irrespective of how difficult their job is or how complex the requirements are.

We had a very well-known automaker that at one point reached out to us and said, "Hey, I have this unusual PCB. It is a 30 by 5 form factor that doesn't fit into most factory machines. Can you build this?" Out of our hundreds of production lines, we had three that could create that particular board, which would have been a months-long exercise for a traditional supply chain team. With our software, that happens very quickly. Again, we're the only business out there that operates this way. Usually, you work with factories individually, but most importantly, it is not just a thing that matches up customers and factories. We are the
ones responsible for production. We are the ones producing it in this factory network. We have design engineers, we have manufacturing engineers, we have quality engineers, and ultimately we are the ones delivering products to the customer. It is the all-in model for turning manufacturing into a cloud-like service.

Brenden Duncombe: Yeah, and I would say also one of the more unique aspects is that typically in this process, when you move from prototyping to production, most customers are used to working with a prototyping shop, and then they have to learn all the same lessons over again when they move to production. They have to get prototypes from the production house to make sure that they know how to build it correctly, even though they already have prototypes from their prototyping house. And it is very unique that we handle both aspects of that. You can stay with us for any volume of PCB, and
we move through your production lifecycle with you.

Expanding on that, obviously one of your USPs is that customers can go from prototype to production without sharing files with factories or needing to find the right supplier. Could you explain that process in a little bit more detail, and some of the technology (for example, do you use a lot of AI with this)?

Misha Govshteyn: Well, to be clear, our customers do share files with us, but we are one of the most secure platforms for doing so. My background is outside of manufacturing. I come from the cybersecurity world. Brenden comes from the electronics world. So, data privacy and cybersecurity are the main domains. But at the end of the day, you're sharing your design files with one party. That's MacroFab. We are extracting only the relevant information that the factories need to decide whether they can build it, and sharing just that abstract with them. That's in contrast with what usually happens in the supply chain world. Coming from a cybersecurity background, I know how blind we are to what happens in the supply chain universe. But in reality, no thought goes into what your partners need to see and what they don't. What you get many times is a multi-gigabyte package of everything. You get giant design files. They blast this to every supplier for price discovery. They're just trying to figure out who can build this product and do it at the lowest possible price. The privacy implications of that are immense. So we always get asked: "What is my risk?" What
you're doing now is incredibly risky. You're sending files to all sorts of factories. Some of these factories may not even be real. With us, it is a very different story. You send it to MacroFab. We take that digital package and share only the relevant information with factories that need to see it. So, software determines who gets to see this information. We use many algorithms and a lot of machine learning to do that. But ultimately, it is not just algorithms. Many times, it is data classification and knowing who should see something.

Brenden Duncombe: Commonly, we will see customers share data that is optional for a quote. You will see firmware files. You will see things about their assembly. All of that, we strip out, and we only share the stuff that's required to manufacture the piece that they're quoting.

Misha Govshteyn: But the current supply chain and data privacy state is terrible. Right now, I was
blown away when I saw what we received from customers as quote requests. It is a massive amount of unnecessary information. As Brenden said, sometimes they'll package source code with it. There are better reasons than price discovery to share your most intimate secrets with your suppliers.

And who are your customers, and what industries do you work in?

Misha Govshteyn: We are most dominant, I would say, in the industrial space. So that's probably our biggest segment. We have a lot of automotive companies that work with us, and we have done much work with autonomous trucking companies. And especially at the earliest stages of design, which is a high-tech, very iterative industry, our ability to turn around prototypes very quickly is important. Many times, these companies are tech startups. So the electronics team wants to match how they work to how the software team works. Software teams these days use concepts like continuous deployment and rapid iteration. So they match their cadence. Because often, it is not just about building a PCB prototype; firmware gets burned onto it. So, the software team has to be in lockstep with it and vice versa. If your software team executes very quickly, but your hardware team
is slowing them down, everything slows down. And we're talking about some of the most expensive resources in the business slowing down across the board. We work with many startups, many drone companies, and many robotics companies. Oil and gas is a big field for us. We're in Texas, so that's natural. A lot of innovation and digitization happens in oil and gas. We don't do many consumer electronics. I think of that as almost an entirely different industry. I think building one type of product for millions and millions of people is fundamentally a different job than making something like an automotive product, where each car has, on average, something like 85 circuit boards. And I think that number is growing. There's an immense number of chips in cars. There's a tremendous number of PCBs. Even mundane things like turning on your blinkers: there are PCBs involved in that. Even in traditional cars, much less autonomous cars.

How
important are PCBs in autonomous cars?

Misha Govshteyn: I'll defer most of the answer to Brenden, but when you really think about what autonomous cars are built from, it is a lot of very high-powered computing units. Some of the automotive computing units are as powerful as crypto-crunching devices, and they have many sensors. None of these things talk to each other without electronics. Obviously, PCBs are where you mount a lot of this infrastructure, so it is probably better for Brenden to explain it in more detail, but it simply doesn't work without PCBs.

Brenden Duncombe: At the end of the day, nothing works without circuit boards connecting it all together. As Misha said, the number of PCBs in cars is skyrocketing. The amount of information in cars is skyrocketing, and many autonomous vehicles have moved to higher bandwidth interconnects. Every car used to be a CAN bus, and now people are laying down automotive Ethernet and things like that in order to increase bandwidth in cars. And that's in large part due to the number of sensors streaming video from every corner of the car. LiDAR sensors, for instance, require sensor computing, like mainframes in your car or processing in your car, whatever is doing
the decision-making, your AI compute modules. All of that is getting fed back in every single one, especially with the sheer amount of distributed sensing on the car. All of that either requires support circuitry on the sensor or on the computer in order to make it usable for decision-making.

What does MacroFab's approach mean for the way you manufacture and design PCBs and for the rollout of autonomous cars? Does it mean we can get to autonomous cars more quickly because of your process?

Misha Govshteyn: I think for the traditional automakers, the conventional manufacturing approaches are fine. They move much slower. When you think about traditional automakers, controlled releases are really what they're working against. And I've worked with several people in software from the auto industry. They're usually frustrated by how slowly things iterate and change on cars. But every one of those automakers has crossed over into the digital, software-driven world in the last couple of years. Toyota is the largest, and it has a separate auto company that it started specifically for that purpose. The same thing is happening with hardware teams as well. A lot of the production factories are still heavily controlled. But a lot of the prototyping actually happens very rapidly, and it needs a software-enabled, digitized approach. By the way, as a data point, how many companies out there can receive and give you a price quote on your electronic design over a set of API calls? There's only one, and that's us right
now. So we're the only company out there that's truly software-enabled for electronics manufacturing. And that means that if automakers want to move faster, this platform is most aligned with that motion.

Brenden Duncombe: I think it is very clear that now the software is moving a lot more quickly than the hardware is. To keep up with that, especially when it comes to sensing capability and compute capability (such as quicker turnaround times to validate performance against your machine learning models, your AI models), it is critical to evaluate the performance of those. As those models get better and better, whether they can go with lower resolution sensors or find out they need higher resolution, all of that change to what is required to feed those models requires faster prototyping.

What's your opinion on when we'll see mass adoption and rollout of autonomous vehicles?

Misha Govshteyn: We're certainly seeing a rollout now. Major rollouts are happening in Texas, Arizona, and California. As for mass adoption: people have been wrong about that forecast for so many years. I'm hesitant to put a number out there, but I think within five years or so. I actually don't
think there's going to be a switch that makes everything autonomous. We are going to see transportation segments move towards autonomous cars in a major way. So I think a certain portion of driving will be done by autonomous vehicles, probably about a third or so in the next five to seven years. That's my guess.

Brenden Duncombe: I'm also hesitant to make predictions on something that has been so famously incorrectly predicted before. Similarly, we're seeing a lot of rollouts already. A lot of these have been in limited areas or with certain speeds and streets and so on. As we move forward, I'd like to see more discussion about the type of infrastructure needed to support autonomous cars. In addition, I'd like to see whether more adoption of better-connected infrastructure helps ease the adoption. And so, as we move into a world where we're seeing the rollout, before getting more comfortable, it is okay to make an investment in some infrastructure to help support this and make the adoption easier. That will certainly help speed things along.

Obviously regulations change and technology improves all the time, but what are some of the other big challenges that will affect the rollout of autonomous vehicles in the future?

Misha Govshteyn: It needs to be regulatory, first and foremost. Cruise just had to suspend its operations in Texas. You know, it is all related to technology. The real world is full of conditions that even the best software in the world can't necessarily predict. And sometimes that
means the irrationality of courts and law enforcement. So, in this case, Cruise didn't even cause the accident. It was a human driver that caused the accident. But Cruise was involved as a kind of secondary actor, and they still had to deal with the outcome. I'm certainly not an expert in the evolution of automotive products, but autonomous vehicles are going through the same journey as when cars originally became dominant products. Eventually, regulators stepped in and started to slow things down. That's probably the biggest variable. Ultimately, regulatory controls are the biggest thing standing in their way. Ironically, I'm not necessarily down on regulatory controls. I think there is at least one area, for example, where they could be immensely helpful. Right now there's no regulation out there for where you send your intellectual property and how much of it to send to which countries. So we
treat other countries as just a place to get lower costs, when we should treat other countries at the very least as competitors and, in some cases, adversaries. More regulatory controls in that domain would actually be a net positive. Right now cost is the thing that supply chains care about most. I think in the future they will all move faster if they stay closer to home. Working with companies like MacroFab, they can match their speed requirements. From experience, often in order to do the most secure thing, you have to be forced to do it by regulatory controls. So, I think regulation is obviously a double-edged sword.

Brenden Duncombe: I think regulation is the main thing. I also think that when we start talking about mixed use, it is easy to envision a world where every car is autonomous, and so they all work together just fine. But I think the public response is also part of it, right? Autonomous cars are a massive change, and they don't drive exactly the same way humans do, so that will take some getting used to. I think there's a lot of human adoption needed with being on the road and your usage patterns, but also driving that adoption. Even if the regulators approve it, there can also be a lot of pushback from other drivers that could also become an issue. So it is both sides of the market. Other users of the same infrastructure need to be prepared to share it.

Misha Govshteyn: To extend what Brenden said, it'll also follow the typical hype curve. Right now, there's a lot of excitement about it. Everybody
wants things to happen smoothly and very quickly. And that's almost never the way technologies get adopted. We've mentioned it, but this is the point that we reiterate: right now, the hardware world is the long pole in the tent. It is one of the things that takes the longest. And perceived constraints by the supply chain drive a lot of it. People throw up their hands and say, "I don't really know how to build this any faster. I know the software is ready, and it is already very quick. But my hardware cannot be." The answer to that is: it can be. It can be with MacroFab. A lot of it comes down to whether supply chain teams are able to move quicker, just as fast as software teams, and just as fast as hardware teams want to move. That is an executive change. Only a top-down message can really break through, because until you change the requirements for supply chain teams and say speed is more important than cost
, there will always be this mismatch between how quickly the business wants to go and what the supply chain team is optimizing for. I know how these people get their bonuses, which is the most important thing in the world, and it is still not based on the speed of iteration. It is not based on how quickly they turn prototypes or anything else around. It is all about the bottom line at this point, and ultimately there is a big mismatch between the expectations and the reality of the supply chain.

Brenden Duncombe: Similarly, the software world has adopted CI/CD, continuous integration and continuous deployment, in order to tackle this fast iteration. It is very common now. When anyone starts a project, most of the time it is a software project, and the first thing you do is set up your deployment chain, right? You have all of that built in. Similarly, the electronics world and hardware world can keep up with that.
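As a toy illustration of that CI mindset applied to hardware, a design revision's data can be pushed through automated gates the same way software runs unit tests on every commit. Every name and rule below is hypothetical, not MacroFab's actual tooling; the point is only that design checks can be codified and run automatically.

```python
# Hypothetical CI-style gate for a PCB design revision: each check runs
# on every revision, and any failure blocks the "build".
def run_ci_checks(design):
    """Run every check; return a list of failed check names (empty = pass)."""
    checks = [
        # Assumed panel limit: boards over 400 mm need special handling
        ("outline", lambda d: d["width_mm"] <= 400 and d["height_mm"] <= 400),
        # Assumed supported stackups
        ("layer count", lambda d: d["layers"] in (2, 4, 6, 8)),
        # Assumed minimum manufacturable trace width
        ("min trace", lambda d: d["min_trace_mm"] >= 0.09),
    ]
    return [name for name, ok in checks if not ok(design)]

# An unusually long, narrow board trips the outline gate automatically,
# instead of being discovered weeks later at a factory.
board = {"width_mm": 762, "height_mm": 127, "layers": 4, "min_trace_mm": 0.1}
failures = run_ci_checks(board)
print(failures)  # → ['outline']
```

In a real pipeline the same pattern would wrap ERC/DRC tools and factory capability data behind an API call, which is the kind of automation the interview describes.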
More engineers should feel comfortable iterating hardware more quickly, deploying the exact same technology that they use for software and infrastructure testing, and getting away from the mindset of "well, we still have to support this legacy hardware forever; we made a mistake in the prototype, and we patch it with software," accumulating tech debt over the years. That mindset needs to change a little bit in these areas, and industries that are moving more quickly and iterating can use MacroFab for that support.

Misha Govshteyn: And supply chain is one of the blockers for that, because even the engineers that want to do that eventually get told, "That sounds great, as long as it integrates with our ERP." That is maybe the most expensive requirement. With MacroFab, when they want to move fast, we can enable that with our APIs. The supply chain team has to be part of that answer. You can't have an agile enterprise and a traditional supply chain team. Those two things are incompatible.

Ultrasonic Sensors

An ultrasonic sensor is an electronic device that measures the distance of a target object by emitting ultrasonic sound waves and converting the reflected sound into an electrical signal. Within autonomous vehicles, they are most commonly employed in Intelligent Parking Assist Systems (IPAS), which aid vehicles in parking maneuvers by providing real-time distance and object detection information to the vehicle's control system. From an innovation perspective, ultrasound technology is not known for frequent breakthroughs. Nevertheless, two recent technical solutions in the field of AVs deserve special attention.

In 2023, a MEMS Ultrasonic Sensor Solution introduced an Intelligent Cabin Child Presence Detection system, crucial for child safety in vehicles. It utilizes various sensors to detect children inside a car and alerts the driver. The MEMS ultrasonic sensor module has compact dimensions, measuring 30 x 20 x 5 mm, significantly smaller than both open ultrasonic and millimeter-wave RADAR modules. This MEMS ultrasonic Child Presence Detection solution boasts a detection distance of
over 1 m and a field of view reaching 180° (±90°), ensuring comprehensive coverage and precise monitoring of all cabin positions. Notably, the latest Euro NCAP standards suggest that MEMS ultrasonic sensing could dominate Child Presence Detection systems due to its efficient vital sign detection, extensive sensing range, compact size, and discreet installation. NCAP has now included Child Presence Detection in its testing criteria.

Also in 2023, Murata unveiled a new water-resistant ultrasonic sensor designed for self-driving cars, known as the MA48CF15-7N. This sensor is highly sensitive, responds quickly,
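The ranging principle behind ultrasonic sensors like this reduces to a simple time-of-flight calculation: distance is half the round-trip travel time multiplied by the speed of sound, which drifts with air temperature. The sketch below is a generic textbook formulation, not Murata's firmware; the linear temperature model is a standard approximation.

```python
# Generic ultrasonic time-of-flight ranging sketch (not Murata's firmware).
def ultrasonic_distance_cm(round_trip_s, temp_c=20.0):
    # Textbook approximation: speed of sound in air rises ~0.6 m/s per degree C
    speed_mps = 331.3 + 0.606 * temp_c
    # Halve the round trip (out and back), convert metres to centimetres
    return 100.0 * speed_mps * round_trip_s / 2.0

# A ~32 ms round trip at 20 degrees C corresponds to roughly 5.5 m,
# i.e. near the sensor's reported 550 cm maximum range.
print(round(ultrasonic_distance_cm(0.032)))  # → 549
```

The temperature term is why production sensors specify performance across a temperature range: a 20-degree swing shifts the computed distance by several percent if left uncompensated.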
and is enclosed in a sealed case to protect it from liquids. As cars become more autonomous, the demand for precise short- to medium-range sensors to detect objects is growing. The MA48CF15-7N operates by emitting ultrasonic waves and measuring the time it takes for them to bounce back, determining the presence and distance of nearby objects.

Structure of an ultrasonic sensor (standard target, ultrasonic transducer, controller, clock generator, processing, output). Image credit: Medium, Babak Shahian Jahromi (adapted by Wevolver)

This sensor can detect objects as close as 15 cm and as far away as 550 cm, covering a wide area with
315、 a 120 by 60 angle.Notably,the sensors capacitance is 1100pF10%at 1kHz,ensuring con-sistent performance without the need for frequent adjustments.Operating at a resonant frequency of 48.21.0kHz and with a quality factor(Q value)of 3510,it delivers reliable perfor-mance across various temperatures.Th
316、ese specifications are notably more precise than previous models from Murata,with a 50%reduction in varia-bility,ensuring consistent performance across different units.Location of Continental ultrasonic parking sensor.Image credit:Continental4849INTERVIEW Nexperia Engineering TeamExploring the Futur
317、e of High-Performance Computing in Autonomous VehiclesThe automotive industry is grappling with the need to develop hardware that is not only reliable and efficient but also compact enough to fit within the confines of a ve-hicle.This challenge is amplified by the increasing demands of Advanced Driv
318、er-Assis-tance Systems(ADAS)in autonomous vehicles,which require immense computational resources to process data from an array of sensors and cameras in real-time.The shift towards centralized ADAS architectures marks a significant departure from traditional vehicle design.These systems resemble mid
319、-range server architectures,equipped with dedicated GPUs optimized for complex algorithms and self-learning capabilities.The processing power required for these systems is immense,necessitat-ing the use of multiple high-power microchips.This evolution raises critical questions about the reliability,
safety, and energy consumption of these computing units, especially given their crucial role in autonomous driving. An intriguing aspect of this technological evolution is the continued relevance of discrete components in automotive systems. Despite advancements in integrated circuits, discrete components like transistors, MOSFETs, and diodes remain vital due to their flexibility, reliability, and cost-effectiveness. This persistence underscores the dynamic nature of automotive semiconductor technology and its critical role in shaping the future of autonomous vehicles. In the brief interview below, we heard from Nexperia engineers as they discuss how the increasing demand for computational power in vehicles is reshaping automotive design, the challenges in ensuring system safety and reliability, and the broader impact of these changes on the semiconductor industry.

1. What are the most recent trends
in terms of incorporating new electronic devices into the vehicle and manufacturing processes?

The automotive industry is undergoing a major transformation, with electronics playing an increasingly important role. Recent trends include electrification, connectivity, and autonomous driving. As the automotive industry shifts towards electric and hybrid vehicles, there is a growing demand for high-power and wide-bandgap electronics solutions such as silicon, SiC, and GaN MOSFETs, IGBTs, diodes, and other semiconductor devices capable of efficiently managing and controlling electric power. Nexperia is at the
forefront of this trend, developing innovative semiconductor solutions specifically tailored for electric vehicle applications, enabling higher efficiency, increased power density, and improved thermal management in automotive electronics.

2. How are vectors such as electrification, connectivity, and autonomous vehicle developments influencing the role of electronics in automotive?

Electrification, connectivity, and autonomous vehicle developments greatly influence the role of electronics in automotive applications. Electrified vehicles demand more semiconductor content, driving the need for advanced components. Connectivity requires seamless vehicle-infrastructure communication, while autonomous vehicles rely on complex sensor systems. All three vectors are interconnected, with electrified vehicles offering better options for advanced electronics. Components like camera systems, radar systems, and larger displays rely on electronics. Overall, these vectors amplify the role of electronics in the automotive industry, necessitating advancements in electronic components and systems to support powertrain control, communication capabilities, and autonomous functionalities.

3. How can electronics contribute to
areas such as sustainability and efficient energy management?

Electronics play a vital role in enabling sustainability and efficient energy management. At Nexperia, our focus on developing better power semiconductors enables more efficient cars and applications. By minimizing power losses and enhancing power conversion efficiency, our semiconductors significantly contribute to lower energy consumption, reduced carbon emissions, and extended range in electric vehicles. Efficient power electronics also support renewable energy systems, smart grids, and energy-efficient industrial applications. Through our semiconductor solutions, we strive to enable greener technologies, enhance energy efficiency, and drive the transition towards a more sustainable future.

4. How is your company working on innovating new electronic solutions, either for the vehicle or for your manufacturing processes?

Every new car today
already has approximately 600 Nexperia devices, and while our products are very small, the combined effort can have an impact on efficiency and performance. Thus, we are working on innovating new electronic solutions for both vehicles and manufacturing processes. And while we continuously innovate the "workhorse" silicon, we are also developing leading-edge wide-bandgap devices. These silicon carbide (SiC) and gallium nitride (GaN) semiconductors offer higher efficiency and performance compared to traditional silicon-based devices. By incorporating

Thinking and Learning

Autonomous cars employ advanced algorithms, machine learning, and artificial intelligence to "think" and "learn." They gather data fro