2020 AUTONOMOUS VEHICLE TECHNOLOGY REPORT
The guide to understanding the current state of the art in hardware [...]

[...] developing ultra high-resolution 4D imaging radar technology. MBA at Technion, the Israel Institute of Technology.

Designer
Bureau Merkwaardig, Amsterdam, The Netherlands
Award-winning designers Anouk de l'Ecluse and Daphne de Vries are a creative duo based in Amsterdam. They are specialized in visualizing the core of an artistic problem. Bureau Merkwaardig initiates, develops and designs.

Illustrations
Sabina Begović, Padua, Italy
Croatian-born Sabina is a visual and interaction designer. She obtained a Master's in Visual and Communication Design at Iuav, University of Venice, and a Master's in Art Education at the Academy of Applied Art, Rijeka, Croatia.

Cover Photographer
Benedict Redgrove, London, United Kingdom
Benedict has a lifelong fascination with technology, engineering, innovation and industry, and is a dedicated proponent of modernism. This has intuitively led him to capturing projects and objects at their most cutting edge. He has created an aesthetic of photography that is clean, pure and devoid of any miscellaneous information, winning him acclaim and numerous awards. Redgrove has amassed a following and client base from some of the most advanced companies in the world. A career spent recording the pioneering technology of human endeavours has produced a photographic art form that gives viewers a window into an often unseen world, such as Lockheed Martin Skunk Works, the UK MoD, the European Space Agency, British Aerospace and NASA. Whether capturing the U-2 reconnaissance pilots and stealth planes, the Navy Bomb Disposal Division or spending time documenting NASA's past, present and future, Benedict strives to capture the scope and scale of advancements and what they mean to us as human beings. His many awards include the 2009 AOP Silver, DCMS Best of British Creatives, and the Creative Review Photography Annual 2003, 2008, and 2009.

At Wevolver we are great fans of Benedict's work and how his pictures capture a spirit of innovation. We're grateful he has enabled us to use his beautiful images of the Robocar to form the perfect backdrop for this report.

Many thanks to
The people at Roborace, specifically Victoria Tomlinson and Alan Cocks. Edwin van de Merbel, Dirk Wittdorf, Petra Beekmans-van Zijll and all the other people at Nexperia for their support. Our team at Wevolver, including Sander Arts, Benjamin Carothers, Seth Nuzum, Isidro Garcia, Jay Mapalad, and Richard Hulskes. Many thanks for the proofreads and feedback. The Wevolver community for their support, knowledge sharing, and for making us create this report. Many others that can't all be listed here have helped us in big or small ways. Thank you all.

Beyond the people mentioned here, we owe greatly to the researchers, engineers, writers, and many others who share their knowledge online. Find their input in the references.

Media Partner
Supplyframe
Supplyframe is a network for electronics design and manufacturing. The company provides open access to the world's largest collection of vertical search engines, supply chain tools, and online communities for engineering. Their mission is to organize the world of engineering knowledge to help people build better hardware products, and at Wevolver we support that aspiration and greatly appreciate that Supplyframe contributes to the distribution of this report among their network.

"It's been an enormously difficult, complicated slog, and it's far more complicated and involved than we thought it would be, but it is a huge deal."
Nathaniel Fairfield, distinguished software engineer and leader of the behavior team at Waymo, December 2019 1

Introduction
Bram Geenen, Editor in Chief, CEO of Wevolver

Motorized transportation has changed the way we live. Autonomous vehicles are about to do so once
more. This evolution of our transport - from horses and carriages, to cars, to driverless vehicles - has been driven by both technical innovation and socioeconomic factors. In this report we focus on the technological aspect.

Looking at the state of autonomous vehicles at the start of the 2020s, we can see that impressive milestones have been achieved, such as companies like Waymo, Aptiv, and Yandex offering autonomous taxis in dedicated areas since mid-2018. At the same time, technology developers have run into unforeseen challenges. Some industry leaders and experts have scaled back their expectations, and others have spoken out against optimistic beliefs and predictions.2,3 Gartner, a global research and advisory firm, weighs in by now placing autonomous vehicles in the Trough of Disillusionment of their yearly Hype Cycle.4

The engineering community is less affected by media hype: over 22% of the engineers visiting the Wevolver platform do so to gain more knowledge on autonomous vehicle technology.5 Despite how much topics like market size and startup valuations have been covered globally by the media, many engineers have expressed to our team at Wevolver that comprehensive knowledge to grasp the current technical possibilities is still lacking.

Therefore, this report's purpose is to enable you to be up to date and understand autonomous vehicles from a technical viewpoint. We have compiled and centralized the information you need to understand which technologies are needed to develop autonomous vehicles. We will elaborate on the engineering considerations that have been and will be made for the implementation of these technologies, and we'll discuss the current state of the art in the industry. This report's approach is to describe technologies at a high level, to offer the baseline knowledge you need to acquire, and to use lots of references to help you dive deeper whenever needed.

Most of the examples in the report will come from cars. However, individual personal transportation is not the only area in which Autonomous Vehicles (AVs) will be
deployed and in which they will have a significant impact. Other areas include public transportation, delivery [...]

"If you don't solve vision, it's not solved. [...] You can absolutely be superhuman with just cameras."

The main benefits of passive sensors are17:
- High resolution in pixels and color across the full width of the field of view.
- Constant frame rate across the field of view.
- Two cameras can generate a 3D stereoscopic view.
- The lack of a transmitting source reduces the likelihood of interference from another vehicle.
- Low cost due to matured technology.
- The images generated by these systems are easy for users to understand and interact with.

Indeed, Tesla cars mount an array of cameras all around the vehicle to gather visual field information, and London-based startup Wayve claims that its cars, which rely only on passive optic sensors, are safe enough for use in cities.
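One benefit listed above, the 3D stereoscopic view from two cameras, rests on simple geometry: for a rectified stereo pair, depth equals focal length times baseline divided by pixel disparity. A minimal sketch of that relation (the focal length, baseline, and disparity values below are hypothetical, not taken from any particular sensor):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical camera: 1000 px focal length, 30 cm baseline.
# An object seen with 15 px of disparity lies 20 m away.
print(stereo_depth(1000.0, 0.30, 15.0))  # → 20.0
```

Because depth is inversely proportional to disparity, range accuracy degrades quadratically with distance, which is one reason stereo vision is complemented by active ranging sensors.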
The main shortcoming of passive sensors is their performance in low light or poor weather conditions; because they do not have their own transmission source, they cannot easily adapt to these conditions. These sensors also generate 0.5-3.5 Gbps of data,18 which can be a lot for onboard processing or for communicating to the cloud. It is also more than the amount of data generated by active sensors.

If a passive camera sensor suite is used on board an autonomous vehicle, it will likely need to see the whole surrounding of the car. This can be done by using a rotating camera that takes images at specific intervals, or by stitching the images of 4-6 cameras together through software. In addition, these sensors need a high dynamic range (the ability to image both highlights and dark shadows in a scene) of more than 100 dB,22 giving them the ability to work in various light conditions and distinguish between various objects.

Dynamic range is measured in decibels (dB), a logarithmic way of describing a ratio. Humans have a dynamic range of about 200 dB. That means that in a single scene, the human eye can perceive tones that are about 1,000,000 times darker than the brightest ones.
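The decibel figures above can be converted into linear contrast ratios using the logarithmic definition. The sketch below assumes the 20·log10 amplitude convention common in imaging (under which a 1,000,000:1 ratio corresponds to 120 dB):

```python
import math

def db_to_ratio(db: float) -> float:
    """Dynamic range in dB to a linear contrast ratio (20*log10 amplitude convention)."""
    return 10 ** (db / 20)

def ratio_to_db(ratio: float) -> float:
    """Linear contrast ratio to dynamic range in dB."""
    return 20 * math.log10(ratio)

print(db_to_ratio(100))        # ~100,000:1, the >100 dB automotive camera requirement
print(ratio_to_db(1_000_000))  # ~120 dB for a 1,000,000:1 ratio
```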
Cameras have a narrower dynamic range, though they are getting better.

The electromagnetic spectrum and its usage for perception sensors.16

"We need more time for the car to react, and we think imaging radar will be a key to that."
Chris Jacobs, Vice President of Autonomous Transportation and Automotive Safety, Analog Devices Inc, January 2019 26

Active Sensors
Active sensors have a signal transmission source and rely on the principle of Time of Flight (ToF) to perceive the environment. ToF measures the travel time of a signal from its source to a target, i.e. the time it takes for the signal to return. The frequency of the signal used determines the energy used by the system, as well as its accuracy. Therefore, determining the correct wavelength plays a key role in choosing which system to use.

Ultrasonic sensors (also referred to as SONAR: SOund NAvigation and Ranging) use ultrasound waves for ranging and are by far the oldest and lowest-cost of these systems. As sound waves have the lowest frequency (longest wavelengths) among the sensors used, they are more easily disturbed. This means the sensor is easily affected by adverse environmental conditions like rain and dust. Interference created by other sound waves can affect the sensor performance as well and needs to be mitigated by using multiple sensors and by relying on additional sensor types. In addition, as sound waves lose energy as distance increases, this sensor is only effective over short distances, such as in park assistance. More recent versions rely on higher frequencies to reduce the likelihood of interference.24

RADAR (RAdio Detection And Ranging) uses radio waves for ranging. Radio waves travel at the speed of light and have the lowest frequency (longest wavelength) of the electromagnetic spectrum. RADAR signals are reflected well by materials that have considerable electrical conductivity, such as metallic objects. Interference from other radio waves can affect RADAR performance, while transmitted signals can easily bounce off curved surfaces, and thus the sensor can be blind to such objects. At the same time, using the bouncing properties of the radio waves can enable a RADAR sensor to see beyond objects in front of it. RADAR has lesser abilities in determining the shape of detected objects than LIDAR.25

The main advantages of RADAR are its maturity, low cost, and resilience against low light and bad weather conditions. However, radar can only detect objects with low spatial resolution and without much information about the spatial shape of the object; thus, distinguishing between multiple objects or separating objects by direction of arrival can be hard. This has relegated radars to more of a supporting role in automotive sensor suites.17

Time of flight
principle, illustrated. Image: Wevolver.

The distance can be calculated using the formula d = (v·t)/2, where d is the distance, v is the speed of the signal (the speed of sound for sound waves, and the speed of light for electromagnetic waves), and t is the time for the signal to reach the object and reflect back.
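The formula in the caption is straightforward to sanity-check in code. A small sketch, using approximate propagation speeds (343 m/s for sound in air, 3×10^8 m/s for light) and hypothetical round-trip times:

```python
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C
SPEED_OF_LIGHT = 3.0e8   # m/s, approximate

def tof_distance(speed_mps: float, round_trip_s: float) -> float:
    """Distance to target: d = (v * t) / 2, halved because t is the round trip."""
    return speed_mps * round_trip_s / 2

# An ultrasonic echo returning after 10 ms puts the obstacle ~1.7 m away
# (typical park-assist range).
print(tof_distance(SPEED_OF_SOUND, 0.010))
# A LIDAR pulse returning after 1 µs corresponds to ~150 m.
print(tof_distance(SPEED_OF_LIGHT, 1e-6))
```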
This calculation method is the most common but has limitations, and more complex methods have been developed; for example, using the phase shift in a returning wave.23

[...] Accuracy improves where the receiver leverages signals from multiple GNSS systems. Furthermore, accuracy can be brought down to 1 cm levels using additional technologies that augment the GNSS system.

To identify the position of the car, all satellite navigation systems rely on the time of flight of a signal between the receiver and a set of satellites. GNSS receivers triangulate their position using their calculated distance from at least four satellites.48
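The principle can be illustrated with a toy example. A real receiver solves for three position coordinates plus its own clock bias from pseudoranges to four or more satellites; the 2D sketch below, with three exact ranges to known beacons at hypothetical positions, shows the core idea that subtracting squared-range equations leaves a linear system:

```python
import math

def trilaterate_2d(beacons, ranges):
    """Solve for (x, y) from three beacon positions and exact ranges.

    Subtracting the squared-range equations cancels the x^2 + y^2 terms,
    leaving a 2x2 linear system, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    b2 = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - a12 * b2) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical beacons and ranges to a vehicle actually located at (3, 4):
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [5.0, math.sqrt(65.0), math.sqrt(45.0)]
print(trilaterate_2d(beacons, ranges))  # ≈ (3.0, 4.0)
```

With noisy pseudoranges and four unknowns, receivers instead solve an over-determined version of this system by least squares.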
By continuously sensing, the path of the vehicle is revealed. The heading of the vehicle can be determined using two GNSS antennas, by using dedicated onboard sensors such as a compass, or it can be calculated based on input from vision sensors.49

While accurate, GNSS systems are also affected by environmental factors such as cloud cover and signal reflection. In addition, signals can be blocked by man-made objects such as tunnels or large structures. In some countries or regions, the signal might also be too weak to accurately geolocate the vehicle.

To avoid geolocalization issues, an Inertial Measurement Unit (IMU) is integrated with the system.50,51 By using gyroscopes and accelerometers, such a unit can extrapolate the data available to estimate the new location of the vehicle when GNSS data is unavailable.
In the absence of additional signals or onboard sensors, dead reckoning may be used, where the car's navigation system uses wheel circumference, speed, and steering direction data to calculate a position from occasionally received GPS data and the last known position.52
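A dead-reckoning step like the one described can be sketched with a kinematic bicycle model. The wheel circumference and wheelbase below are hypothetical, and a real system would also have to handle wheel slip and sensor noise:

```python
import math

def dead_reckon(x, y, heading, steps, wheel_circumference=2.0, wheelbase=2.7):
    """Propagate a pose from wheel-revolution and steering-angle measurements.

    Kinematic bicycle model: each step advances the pose by the rolled
    distance and rotates the heading by (d / L) * tan(steering).
    All parameters are hypothetical, for illustration only.
    """
    for revolutions, steering_rad in steps:
        d = revolutions * wheel_circumference      # distance rolled this step
        x += d * math.cos(heading)
        y += d * math.sin(heading)
        heading += (d / wheelbase) * math.tan(steering_rad)
    return x, y, heading

# Ten straight-ahead steps of one wheel revolution each: 20 m along x.
print(dead_reckon(0.0, 0.0, 0.0, [(1.0, 0.0)] * 10))  # → (20.0, 0.0, 0.0)
```

Because each step compounds the error of the previous one, dead-reckoned positions drift and must be periodically corrected against GNSS fixes or map features.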
In a smart city environment, additional navigational aid can be provided by transponders that provide a signal to the car; by measuring its distance from two or more signals, the vehicle can find its location within the environment.

Maps
Today, map services such as Google Maps are widely used for navigation. However, autonomous vehicles will likely need a new class of high definition (HD) maps that represent the world at up to two orders of magnitude more detail. With an accuracy of a decimeter or less, HD maps increase the spatial and contextual awareness of autonomous vehicles and provide a source of redundancy for
their sensors.

"If we want to have autonomous cars everywhere, we have to have digital maps everywhere."
Amnon Shashua, Chief Technology Officer at Mobileye, 2017 55

A 3D HD map covering an intersection. Image: Here

"The need for dense 3-D maps limits the places where self-driving cars can operate."
Daniela Rus, director of MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), 2018

By triangulating the distance from known objects in an HD map, the precise localization of a vehicle can be determined. Another benefit is that the detailed information a high definition map contains could narrow down the information that a vehicle's perception system needs to acquire, and enable the sensors and software to dedicate more effort towards moving objects.53

HD maps can represent lanes, geometry, traffic signs, the road surface, and the location of objects like trees. The information in such a map is represented in layers, with generally at least one of the layers containing 3D geometric information of the world in high detail to enable precise calculations. Challenges lie in the large effort needed to generate high definition maps and keep them up to date, as well as in the large amount of data storage and bandwidth it takes to store and transfer these maps.54

Most in the industry consider HD maps to be a necessity for high levels of autonomy, in any case for the near future, as they have to make up for limited abilities of