Automated Driving Safety Evaluation Framework Ver 3.0
Japan Automobile Manufacturers Association, Inc.
Sectional Committee of AD Safety Evaluation, Automated Driving Subcommittee
December 2022
(C) Copyright Japan Automobile Manufacturers Association, Inc., All rights reserved.

List of committee members
Chief of Sectional Committee: Hideaki Sato, Toyota Motors
Deputy Chief of Sectional Committee, Safety argument WG Leader: Koichiro Ozawa, Honda Motor Co., Ltd.
Deputy Chief of Sectional Committee: Eiichi Kitahara, Nissan Motors Co., Ltd.
Committee member, Virtual Evaluation of Perception WG Leader: Yumi Kubota, Nissan Motor Co., Ltd.
Committee member: Kohji Ishiwata, Nissan Motor Co., Ltd.
Committee member: Tomofumi Koishi, Honda Motor Co., Ltd.
Committee member: Shinji Narimatsu, Honda Motor Co., Ltd.
Committee member: Yoshiya Kubo, Mazda Motor Corporation
Committee member: Yusuke Yamada, Mazda Motor Corporation
Committee member: Fumihiko Takegoshi, SUZUKI MOTOR CORPORATION
Committee member: Shinji Tsunoda, SUBARU CORPORATION
Committee member: Kenichi Yamada, Daihatsu Motor Co., Ltd.
Committee member: Masaru Idoguchi, Hino Motors Ltd.
Committee member: Atsushi Ohshiba, Hino Motors Ltd.
Committee member: Shinichiro Kawano, Isuzu Motors Limited
Committee member: Yasuhiro Furukawa, Mitsubishi Motors Corporation
Committee member: Tomoya Yabuzaki, Yamaha Motor Co., Ltd.
Committee member: Tetsuya Ishida, UD Trucks Corporation
Advisor: Hiroaki Nakata, Hitachi Astemo, Ltd.
Advisor: Koichi Terui, Hitachi Astemo, Ltd.
Advisor: Tatsuhiko Monji, Hitachi Astemo, Ltd.
Advisor: Yuko Murase, DENSO CORPORATION
Advisor: Kenji Suganuma, DENSO CORPORATION
Advisor: Shingo Jinno, DENSO CORPORATION
Advisor: Masami Suzuki, Pioneer Corporation
Contents
Main changes and additions to Ver. 3 ... 1
1. Positioning of this Paper ... 1
2. Automated Driving System Safety Argumentation Structure ... 2
2.1. Issues with existing approaches ... 2
2.1.1. Safety evaluation through long-distance/long-duration driving tests ... 2
2.1.2. Data storage/classification scenario-based approach ... 2
2.2. Overview of Physics Principles Approach Process ... 3
2.3. Safety Argumentation Structure Framework ... 5
2.3.1. Automated driving safety principles ... 5
2.3.2. Scope of safety evaluation ... 6
2.3.3. Method of evaluating safety ... 7
2.3.4. Safety evaluation method for perception disturbance ... 20
2.3.5. Safety evaluation method for vehicle disturbance ... 22
3. Scenario-Based Safety Assurance Process ... 25
3.1. Safety argumentation scheme (Steps of the V-shaped model) ... 25
3.1.1. Item definition ... 25
3.1.2. Safety Analysis ... 26
3.1.3. Safety Design and Safety Concept ... 26
3.1.4. System development ... 26
3.1.5. Examination and validation of the sub-system and the vehicle ... 27
3.1.6. Safety assessment ... 27
3.1.7. Final check process before release ... 27
3.1.8. Incident management ... 27
4. Scenario structure ... 28
4.1. Traffic disturbance scenario ... 28
4.1.1. General vehicle scenario ... 28
4.1.2. Scenarios unique to motorcycles ... 36
4.1.3. Scenarios resulting from the combination of behaviours by several vehicles ... 36
4.2. Perception disturbance scenarios ... 37
4.2.1. Perception disturbance scenarios ... 37
4.2.2. Blind Spot Scenarios ... 64
4.2.3. Communication disturbance scenario ... 73
4.3. Vehicle motion disturbance scenarios ... 76
4.3.1. Classification of vehicle body input ... 76
4.3.2. Classification of tyre inputs ... 78
4.3.3. Predictable vehicle motion disturbance safety approach ... 80
5 Scenario Database ... 90
5.1 Three layers of extraction ... 90
5.2 Database parameters, format, and architecture ... 90
5.3 Test scenario database interface specification ... 91
Annex A Road Geometry ... 93
A.1 Road geometry component elements ... 95
A.2 Basic parameters of road geometry ... 96
A.3 Update with actual environmental data ... 97
A.4 Updating road geometry parameters based on actual world map data ... 97
Annex B Scenarios for Motorcycles ... 99
B.1 Classification of surrounding motorcycle location and motion ... 99
B.2 Traffic disturbance scenario unique to motorcycles ... 99
Annex C Approach for complex scenarios of traffic disturbance ... 101
C.1 Concept of avoidance motion scenario ... 101
C.2 Traffic flow scenarios ... 101
C.2.1 Avoidance trigger ... 102
C.2.2 Avoidance space ... 102
C.2.3 Cut-in vehicles into the avoidance area ... 104
C.2.4 Road environment ... 104
Annex D Verifying the completeness of scenario database based on accident data ... 106
D.1 German In-Depth Accident Study (GIDAS) data ... 106
D.2 Pre-crash scenario typology for crash avoidance research (NHTSA) ... 107
D.3 Institute for Traffic Accident Research and Data Analysis (ITARDA) data ... 107
Annex E Principle models and evaluation scenarios of perception disturbances ... 111
E.1 Processes of principle models description and evaluation scenario derivation ... 111
E.2 Principle models and evaluation scenarios of mmWave Radar ... 112
E.2.1 mmWave Radar: Large difference of signal (S) (recognition target) ... 113
E.2.2 mmWave Radar: Low D/U (road surface multipath) ... 119
E.2.3 mmWave Radar: Low D/U (change of angle) ... 123
E.2.4 mmWave Radar: Low S/N (direction of a vehicle) ... 128
E.2.5 mmWave Radar: Low D/U (surrounding structures) ... 132
E.3 Principle models and evaluation scenarios of LiDAR ... 136
E.3.1 LiDAR: Attenuation of signal (recognition target) ... 137
E.3.2 LiDAR: Noise ... 146
E.3.3 LiDAR: Signal not from recognition target (reflection from raindrops) ... 154
E.4 Principle models and evaluation scenarios of Camera ... 160
E.4.1 Camera: Hidden (image cut out) ... 161
E.4.2 Camera: Low spatial frequency/low contrast (caused by spatial obstruction) ... 171
E.4.3 Camera: Overexposure ... 184
Annex F Guideline for validation of virtual environment with perception disturbance ... 193
F.1 Overview of requirements defined in this Annex ... 193
F.2 Common requirement and reproductivity validation method ... 194
F.2.1 The way of thinking about common requirement ... 194
F.2.2 The way of thinking about common requirement for each sensor ... 196
F.2.3 Validation method of common requirement ... 204
F.3 Perception disturbance reproducing requirement and reproductivity validation method ... 233
F.3.1 The way of thinking about perception disturbance reproducing requirement ... 233
F.3.2 The way of thinking about perception disturbance reproducing requirement for each sensor ... 233
F.3.3 Validation method of perception disturbance reproducing requirement ... 239
Annex G Validation of Simulation Tools and Simulation Test Methods Related to UN Regulation No. 157 ... 259
G.1 Purpose and Scope ... 259
G.2 Terminology ... 259
G.3 Method for Validating the Simulation Tool ... 260
G.3.1 Purpose of This Chapter ... 260
G.3.2 Validation Method and Criteria ... 260
G.3.3 Simulation Tool Requirements ... 261
G.4 Procedure for Validating the Simulation Tool ... 262
G.4.1 Purpose of This Chapter ... 262
G.4.2 Procedure for Validating the Simulation Tool ... 262
G.5 ADS Safety Performance Evaluation Simulation Method ... 263
G.5.1 Purpose of This Chapter ... 263
G.5.2 Test Method ... 263
G.5.3 Definition of the Parameters of the Ego and Other Vehicles ... 264
G.5.4 Definition of Each Scenario ... 265
G.5.5 Criteria for Pass or Fail ... 265
G.5.6 Parameter Range for Simulations ... 266
G.5.7 Conducting Simulation ... 269
G.6 Submission Documents ... 271

Main changes and additions to Ver. 3
Traffic disturbance scenarios: The motorway-specific content has been revised to include general roads, and a traffic disturbance scenario for general vehicles that includes general roads has been
added along with the addition of ITARDA data to Annex D.
Perception disturbance scenarios: The content of Annexes E and F has been added.
Vehicle motion disturbance scenarios: Preventability/unpreventability boundary conditions have been added for general roads.
1. Positioning of this Paper
【Background】
The
28、realization and deployment of autonomous driving(AD)is expected to bring forth an even safer society which is also more efficient and with a freer mobility.The fulfillment of these expectations is a major global challenge that stands on the sufficient safety assurance and verification of the autonom
29、ous vehicles both in terms of performance and technology.In this document,the Japan Automobile Manufacturers Association Inc.(JAMA)has summarized the best practice on safety argumentation structuring,safety evaluation,and safety assessment methods needed to enable logical completeness,practicability
30、,and transparency of AD safety.The safety assessment and the technical judgment may be revised according to the practical implementation and evolution of the AD safety assurance dialogue,along with technical content modifications.【Aims】To enhance safety and efficiency of AD systems development by pr
31、oviding guidelines that serve as a common ground for each JAMA member at each product development stage,from planning and design,to evaluation.To gain a common technical understanding when international regulations and standards are formulated.To clarify JAMA position when cooperating with internati
32、onal projects.2 2.Automated Driving System Safety Argumentation Structure An overview of the safety argumentation structure for AD systems with SAE automation level 3 through to level 5 is provided in this chapter.2.1.Issues with existing approaches 2.1.1.Safety evaluation through long-distance/long
-duration driving tests
Long-distance/long-duration driving test strategies aim at ensuring safety by randomly identifying malfunctions and unintended disengagements in a black-box manner, until a certain value for a probabilistic metric is guaranteed. These strategies, applied as a safety evaluation process, present issues both in terms of evaluation scope sufficiency and of explainability in emergencies. The main issue related to evaluation scope sufficiency is the stochastic increase of factors and associated hazards with driving distance and time. In other words, it is not possible to ensure that hazards due to factors not identified in long-distance/long-duration runs will not occur after release. Further, within a context in which there is neither legal nor social consensus on criteria based on driving distance or time, the issue of explainability in emergencies relates to the impossibility of clarifying social responsibility for emergency interventions when hazards are encountered by the system. Probabilistic safety criteria based on long-distance/long-time driving also present problems from a technical development point of view, due to the inefficiency of identifying factors that depend on the environmental conditions in which the driving was conducted, as well as on the characteristics of the vehicle.
2.1.2. Data storage/classification scenario-based approach
A number of countries are actively developing data-driven scenario-based approaches to address the challen
37、tors that dependend on the environmental conditions in which the driving was conducted,as well as on the characteristics of the vehicle.2.1.2.Data storage/classification scenario-based approach A number of countries are actively developing data driven scenario-based approaches to address the challen
38、ges of applying previous ADAS development processes for safety assurance of AD systems of SAE automation level 3 through to level 5.These approaches incorporate normal traffic and accident data,process the data,and systematically categorize the processed information into formats known as scenarios w
39、hich are stored in a database.The collection,storage and creation of such scenarios and database in the public domain,free from manufacturers intellectual property and bias,may enable the development a safety evaluation ecosystem,that both certification bodies and manufacturers could incorporate for
40、 the benefit of the general public through safer vehicles.However,the scenario based approach does not resolve per se the aboved mentioned issue concerning evaluation scope sufficiency before release.When the obtained data is tagged and categorized,the compensation for the phenomenon that may occur
41、in the future still depends on the distance and time or the amount of data,so the previously mentioned issue related to evaluation scope sufficiency remains unresolved.Further,if the driving data shared in the public domain is only comprised of images and vehicle trajectories this will lead to insuf
42、ficient safety verification range,as such data may exclude factors related to autonomous vehicles misinterpretaion of both the surroundings and its own conditions,as well as factors possibly affecting vehicle stability.3 2.2.Overview of Physics Principles Approach Process In order to address the lim
43、itations of existing approaches concerning evaluation scope suffciency and explainability in emergencies,a Physical Principles Approach Process for safety evaluation is proposed.This proposal essentially incorporates physics principles into a scenario-based approach.The number of safety-relevant sit
44、uations that an AD system may encounter in real traffic is infinite.Therefore,if scenarios are structuralized by solely combining traffic factors without further considerations,the unlimited number of variables that need to be considered will prevent from a complete scope verification.In contrast wi
45、th the infinite number of safety-relevant situations that an AD system may encounter in traffic,the number of physics principles that the system can apply for safely handling such situations is limited.AD systems decompose all DDT into perception,judgement and operation subtasks,and each of these su
btasks is associated with one or several specific physics principles. Therefore, if scenarios are decomposed and structuralized logically in consideration of the physics of the AD system, then it is possible to provide a complete coverage of all the safety-relevant root causes for a given DDT. This motivates the incorporation of perception, traffic situation, and operation related disturbances, and the corresponding scenario structures introduced in the following table, in Figure 1 and Figure 2, and elaborated in detail in the following chapters.

Task: Perception
- Processing results: Own position, surrounding traffic environment positional information and other traffic information
- Disturbance: Perception disturbance
- Governing physics principles: Light, radio wave and infrared light propagation principles that affect camera, millimetre-wave radar and LiDAR sensors, respectively

Task: Judgement
- Processing results: Path, speed plan instructions
- Disturbance: Traffic disturbance
- Governing physics principles: Kinematics, describing the motion of traffic participants, objects and systems of groups of objects, without reference to the causes of motion

Task: Operation
- Processing results: Movement instruction allocation for each actuator (ACT) for achieving path and speed plan instructions
- Disturbance: Vehicle control disturbance
- Governing physics principles: Dynamics, concerned with forces applied on the vehicle's body and tires, and their effects on motion

Figure 1. Different categories of structuralized scenarios considering physics principles for each corresponding perception, judgement and control task
Figure 2. Schematic of the three disturbance categories considered
51、 to logically structuralize scenarios Perception disturbance refers to conditions in which the sensor system may fail to correctly judge a hazard or a non-hazard for sensor or vehicle intrinsic or extrinsic reasons.Examples of intrinsic reasons include part mounting(e.g.unsteadiness related to senso
52、r mounting or manufacturing variability),or vehicle conditions(e.g.vehicle inclination due to uneven loading that modifies sensor orientation,or sensor shielding with external attachments such as bicycle racks).External reasons include environmental conditions(e.g.sensor cloudiness,dirt,light,etc.)o
53、r blind spots induced by surrounding vehicles.Traffic disturbance refers to traffic conditions that may lead to a hazard resultant of a combination of the following factors:road geometry(e.g.,branch),ego-vehicle behaviour(e.g.,lane change),and surrounding vehicle location and action(e.g.cut-in from
54、a near side vehicle).Vehicle disturbance refers to situations in which perception and judgement work correctly but where the subject vehicle may fail to control its own dynamics.This can be due to intrinsic vehicle factors(e.g.total weight,weight distribution,etc.)or extrinsic vehicle factors(e.g.ro
55、ad surface irregularities and inclination,wind,etc.).Collected normal traffic and accident data can be used to confirm possible gaps in terms of whether situations actually occurring in real traffic are being missed by the logically created scenario systems.Further,by assigning probabilistic ranges
56、to physical parameters for each qualitative scenario category,the data and scenarios can also be used to show in a downscaled manner,to what extent certain situations actually occur.Scenario StructureTraffic DisturbancePerception DisturbanceVehicle DisturbanceTraffic participants unsafe behaviorSens
57、ing/Localize/Communication limitationCause of vehicle instability 5 2.3.Safety Argumentation Structure Framework 2.3.1.Automated driving safety principles The WP29 document for the harmonisation of international regulations on automated driving reads Automated vehicles shall not cause any non-tolera
58、ble risk,meaning that,under their operational domain,shall not cause any traffic accidents resulting in injury or death that are reasonably foreseeable and preventable(UN/WP29,2019,WP29-177-19,Framework document on automated/autonomous vehicles).These definitions allow to contextualize the safety ph
59、ilosophy of the current methodology proposed,with respect to safety principles that international policy makers are applying in the form of a matrix(Figure 3).Considering the two conditions of foreseeability and preventability together generates a 4 quadrant matrix that better contextualises the phi
60、losophy of this document.Scenario based safety evaluation,can be found in the top left quadrant of the matrix where no accidents are acceptable.This quadrant accounts for all scenarios for which an accident is foreseeable and preventable.The bottom left quadrant of the matrix depicts the traffic sit
61、uations that can not be foreseen but that can be prevented.The cases that fall under this category form the basis for learning and serve as a precedent for future generation AD system developments.The top right quadrant of the matrix introduces cases that are foreseeable but not preventable.The situ
62、ations that fall under this category are situations for which mitigation is the only option.Measures to reduce the damage resultant of these unpreventable(yet foreseeable)cases constitutes the main area of focus in this section.The final quadrant(bottom right)accounts for crashes that are neither fo
63、reseeable nor preventable.In these situations,resilience support in the form of legalities,the division of responsibilities,health support,insurance and other such areas need to be the focus of attention.Figure 3.Safety approach in context with foreseeability and preventability matrix 6 2.3.2.Scope
64、of safety evaluation Figure 4 presents a summary of the safety aspects described in the WP29 framework document organized hierarchically.With the common top level safety goal of achieving systems free of unreasonable safety risks,the scope of the current proposal is limited to Validation for System
Safety (highlighted in pink). The validation for system safety according to the safety vision framework can be further decomposed as shown in Figure 5. The scope of the current proposal is limited to critical conditions and excludes pre-critical conditions. The reason for this exclusion is that situations in which there may only be a potential risk (e.g. a frontal vehicle carrying a load that may fall onto the road) may induce many actuations that are not motivated by real risks and that alter traffic, imposing risks on other participants (e.g. braking frequently despite there being no real risk). Therefore, to address pre-critical situations, rather than applying the physics principles approach process, other means are used to verify whether the vehicle follows traffic rules and keeps a sufficient distance from surrounding objects.
Figure 4. Safety Aspects Hierarchy Diagram
Figure 5. Safety argumentation structure diagram
2.3.3. Method of evaluating safety
The main DDT safety risk is collision with the surrounding traffic participants or obstacles, which is systematized through traffic disturbance scenarios. By defining quantified ranges of reasonable foreseeability and preventability for each of these traffic disturbance scenarios, quantitative criteria associated with each test are defined. Based on these traffic-related hazardous scenarios, it is then possible to expand the evaluation to incorporate perception- and vehicle stability-related hazardous scenarios into the assessment, which will enable a comprehensive safe
ty evaluation (Figure 6).
Figure 6. Overview of method of judging safety
2.3.3.1. Traffic disturbance safety evaluation method
A traffic disturbance is the position and actions of traffic participants existing around the ego vehicle that prevent safe driving by the ego vehicle. As previously described, the basic thinking behind the safety principles is to equip the automated driving system with a higher level of avoidance performance than a competent and careful human driver within a foreseeable range. For this purpose, we need to define and model the performance of a competent and careful driver applied to traffic disturbances. By implementing this defined model in a simulation program and deriving the actual scope avoidable by a competent and careful human driver, it is possible to define safety standards in relation to traffic disturbances.
Figure 7. Overview of traffic disturbance safety judgement method
The competent and careful human driver performance model definition (Figure 8) defines the three elements of perception, judgement, and operation. It is important to have objective grounds for defining the parameter coefficients related to the performance shown in the respective segments.
Figure 8. C
ompetent and careful human driver model
Here, the driving action elements of judgement and operation are explained. The main avoidance action of automated driving in relation to traffic disturbances is considered to be the brake operation (deceleration action), and, regardless of the type of traffic disturbance (the position and action of the traffic participants surrounding the ego vehicle), this is fulfilled by defining the performance of a competent and careful human driver. Figure 9 shows a diagram which demonstrates the brake operation of a competent and careful human driver. The model on the left shows the braking operation made by a competent and careful human driver. The model on the right is a functional model of the collision damage mitigation braking system (AEB: Advanced Emergency Braking); it considers the amount of improvement in avoidance performance when equipped with AEB.
Figure 9. Competent and careful human driver brake model
The perception response time, i.e. the time delay from the moment when a competent and careful human driver perceives risk to the time that deceleration braking force occurs, is set at 0.75 s. This value is used by police and domestic courts in Japan when establishing a driver's "perception response time". In terms of maximum deceleration, quoting the Japanese test data shown in Figure 10, the value is 0.774 G. Whereas the braking force generated by normal drivers in emergencies is 0.689 G, normal drivers who have received training in driving techniques generate a braking force of 0.774 G; this is therefore defined as a higher skill value compared with ordinary drivers. Furthermore, in the accident statistics data from NHTSA (Figure 11), 0.74 G is the peak value; therefore, the maximum deceleration of 0.774 G applied to the competent and careful human driver model can be considered appropriate.
Figure 10. Emergency brake characteristic
Figure 11. Maximum deceleration due to deceleration of the preceding vehicle
Figure 12 shows a waveform diagram of deceleration braking for drivers who have received driver skill training. This quotes the Japanese test data previously described. In this waveform diagram, the time for reaching the maximum deceleration is demonstrated, and the time to reach maximum deceleration for a competent and careful human driver is defined as 0.6 s.
Figure 12. Emergency brake characteristics study example (arrival time until maximum deceleration)
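To make the numbers above concrete, the following is a minimal sketch, assuming the parameters quoted in this section (0.75 s perception response time, a 0.6 s ramp up to the maximum deceleration of 0.774 G), of how the driver model's braking profile and stopping distance could be computed. It is an illustration only, not a reference implementation from the framework, and all function and variable names are invented for the example.

```python
# Minimal sketch (not from the JAMA document): deceleration profile and
# stopping distance of the "competent and careful human driver" brake model.
# Assumed parameters: 0.75 s perception response time, 0.774 G maximum
# deceleration, 0.6 s ramp-up time to maximum deceleration.

G = 9.81  # m/s^2

def driver_deceleration(t: float,
                        reaction_time_s: float = 0.75,
                        max_decel_g: float = 0.774,
                        ramp_time_s: float = 0.6) -> float:
    """Deceleration (m/s^2) commanded by the driver model at time t (s)."""
    if t < reaction_time_s:
        return 0.0                                 # still reacting, no brake force yet
    t_brake = t - reaction_time_s
    ramp = min(t_brake / ramp_time_s, 1.0)         # linear ramp to maximum deceleration
    return ramp * max_decel_g * G

def stopping_distance(v0_kmh: float, dt: float = 0.001) -> float:
    """Integrate the profile forward in time until the vehicle stops."""
    v = v0_kmh / 3.6
    x = t = 0.0
    while v > 0.0:
        v = max(v - driver_deceleration(t) * dt, 0.0)
        x += v * dt
        t += dt
    return x

if __name__ == "__main__":
    for v0 in (60.0, 100.0, 120.0):
        print(f"{v0:5.0f} km/h -> stopping distance ~ {stopping_distance(v0):6.1f} m")
```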
82、arios Cut-in scenarios are scenarios in which vehicles travelling in an adjacent lane to the ego vehicle cuts in front of it.Figure 13 shows a schematic expressing boundary conditions where a competent and careful human driver judges it risky when another vehicle cuts in in front of the ego vehicle.
83、Figure 13.Cut-in judgement conditions and danger judgement boundaries The boundary conditions when it is judged that a vehicle travelling in the adjacent lane has cut in front of the ego vehicle are defined as the cut-in vehicle lateral movement distance(wander amplitude).In an actual driving enviro
nment, vehicles driving while maintaining their lane will wander a little to the left or right while driving. Within the scope of this wander lateral movement distance, it is unlikely that the vehicle travelling in the lane adjacent to the ego vehicle is driving with the recognition that it will cut in. Therefore, the cut-in perception boundary conditions were defined from the lateral movement distance (wander amplitude) distribution (Figure 14) of vehicles changing lanes, based on data observed in actual traffic environments. After the cut-in judgement, the boundary conditions at which the ego vehicle perceives risk and a need for emergency braking (risk perception boundaries) can be defined by multiplying the maximum lateral velocity derived from the actual traffic observation data by the risk perception response time. When calculating the risk perception response time, test data using a driv
87、ing simulator carried out in Japan was utilised and analysed.The prerequisites for the test are shown in Figure 16.Figure 14.Actual observation statistics for stagger amplitude Figure 15.Maximum lateral velocity observation data statistics 12 Figure 16.Assumptions for driving simulator tests The tes
88、ts measured the drivers response(reaction time,avoidance operation)for cut-ins from 20 other regular drivers(Table 1).The measurements were performed twice on each participant;by comparing the respective average values of the first and second time,we derived the time until risk was perceived.Table 1
. Test participant attributes
The test results are shown in Figure 17. The results demonstrated that the time from the start of the other vehicle's cut-in to the moment risk was perceived was 0.8 s in the first run and 0.4 s in the second run. Based on these test results, in the first run both the time for identifying the other vehicle's cut-in and the time for perceiving risk were required, whereas in the second run, because the participants were driving while being wary of a cut-in, the time for identifying the cut-in of the other vehicle was not required. However, even when the driver was aware, time was still required for determining risk (Figure 18), and the time until risk was perceived was defined as 0.4 s.
Figure 17. Driving simulator test results
Figure 18. Relationship between cut-in identification time and danger judgement time
As described above,
the risk judgement boundary is defined by multiplying the maximum lateral velocity by the time until risk is perceived. The maximum lateral velocity of 1.8 m/s calculated from the actual traffic observation data is multiplied by the time until risk is perceived of 0.4 s calculated from the driving simulator test results. Therefore, the risk perception boundary is defined as 1.8 m/s × 0.4 s = 0.72 m. When the cut-in perception condition and the risk evaluation boundary are applied to the diagram in Figure 8, the result is Figure 19.
Figure 19. Competent and careful human driver model (Cut-in)
According to the UNR collision warning guidelines, the boundary that requires emergency action is defined as TTC = 2.0 s regarding the longitudinal (distance from the other vehicle) risk evaluation boundary (Figure 20). This is cited to define the longitudinal risk evaluation boundary as TTC = 2.0 s.
Figure 20. UNR collision warning guidelines (Citation)
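The two boundaries just derived (a lateral risk perception boundary of 1.8 m/s × 0.4 s = 0.72 m and a longitudinal emergency boundary of TTC = 2.0 s) can be expressed as a simple combined check. The sketch below is illustrative only and uses invented function and parameter names; it is not part of the framework.

```python
# Illustrative check (not from the JAMA document) of the cut-in risk boundaries:
#  - lateral: risk is perceived once the cutting-in vehicle has moved more than
#    v_lat_max * t_risk = 1.8 m/s * 0.4 s = 0.72 m beyond the cut-in judgement line
#  - longitudinal: the UNR collision-warning boundary TTC = 2.0 s

LATERAL_RISK_BOUNDARY_M = 1.8 * 0.4    # = 0.72 m
TTC_EMERGENCY_BOUNDARY_S = 2.0

def time_to_collision(gap_m: float, ego_speed_mps: float, other_speed_mps: float) -> float:
    """TTC from the longitudinal gap and closing speed (infinite if the gap is opening)."""
    closing = ego_speed_mps - other_speed_mps
    return gap_m / closing if closing > 0.0 else float("inf")

def cut_in_requires_emergency_braking(lateral_intrusion_m: float,
                                      gap_m: float,
                                      ego_speed_mps: float,
                                      other_speed_mps: float) -> bool:
    """True if both the lateral and the longitudinal risk boundaries are crossed."""
    lateral_risk = lateral_intrusion_m >= LATERAL_RISK_BOUNDARY_M
    longitudinal_risk = (time_to_collision(gap_m, ego_speed_mps, other_speed_mps)
                         <= TTC_EMERGENCY_BOUNDARY_S)
    return lateral_risk and longitudinal_risk

# Example: a vehicle 0.9 m into the ego lane, 25 m ahead, 10 km/h slower than the ego vehicle.
print(cut_in_requires_emergency_braking(0.9, 25.0, 100 / 3.6, 90 / 3.6))
```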
2.3.3.1.2. Cut-out Scenario
The cut-out scenario is a scenario in which the leading vehicle that the ego vehicle is following suddenly changes lane to the adjacent lane (cut-out). This scenario evaluates safety in relation to the sudden appearance of a decelerating or stopped vehicle (such as a broken-down car a
96、nd the tail end of a traffic jam)in front of the ego vehicle due to the preceding vehicles cut-out.Figure 21 shows the schematic that represents the boundary condition for the competent and careful human driver who perceives the situation to be risky when the preceding vehicle performs a cut-out.Fig
97、ure 21.Cut-out perception condition and risk evaluation boundary The cut-out perceived boundary condition to perceiving the preceding vehicles cut-out manoeuvre is defined by the amount of lateral movement(drifting amplitude),which is similar to the case with the aforementioned cut-in perception con
98、dition.Both the cut-in and cut-out are maneuvres to change lanes.Similar to the case of cut-in,the boundary condition using the distribution of drifting amplitude from the observation data of real traffic is applied to the perception condition of cut-out.Moreover,the time from the cut out perception
99、 to the recognition of the vehicle ahead that appears and the risk perception is defined as 0.4 sec based on the experimental data(Figure 17 and 18).15 Figure 22.Competent and careful human driver model(cut out)2.3.3.1.3.Deceleration Scenario A deceleration scenario takes into consideration the sudd
100、en deceleration of the leading vehicle that the ego vehicle is following.Although the previous cut-in and cut-out scenarios required the perceived lane change boundaries from the following or leading vehicle,the deceleration scenario only involves the longitudinal behaviour.Therefore,it is only nece
101、ssary to define the deceleration perception time by the leading vehicle to evaluate the risk boundary.Similar to the preceding case,0.4 s can be applied as the time required to evaluate the risk.Figure 23.Risk evaluation boundary in deceleration scenario When the risk evaluation condition of the dec
102、eleration scenario is applied to the diagram in Figure 8,it results in Figure 24.Figure 24.Competent and careful human driver model(Deceleration)16 Definition of Parameters for Deriving Standard The following table lists the parameters required for deriving the safety standards for traffic disturban
103、ces.The evaluation scenarios related to traffic disturbances are generated by defining road geometry,the ego vehicles behaviour,and locations and motions of the surrounding traffic participants.The parameter items required in the evaluation scenario are categorized in a specific numerical range,and
the Pass/Fail boundary is derived within that range.
Table 2. List of traffic disturbance parameters
Operating conditions (roadway):
- # of lanes = The number of parallel and adjacent lanes in the same direction of travel
- Lane width = The width of each lane
Initial condition (initial velocity):
- Ve0 = Ego vehicle
- Vo0 = Leading vehicle in lane or in adjacent lane
- Vf0 = Vehicle in front of leading vehicle in lane
Initial condition (initial distance):
- dx0 = Distance in longitudinal direction between the front end of the ego vehicle and the rear end of the leading vehicle in the ego vehicle's lane or in the adjacent lane
- dy0 = Inside lateral distance between the outside edge line of the ego vehicle (parallel to the vehicle's median longitudinal plane) within its lane and the outside edge line of the leading vehicle (parallel to the vehicle's median longitudinal plane) in the adjacent lane
- dy0_f = Inside lateral distance between the outside edge line of the leading vehicle (parallel to the vehicle's median longitudinal plane) within its lane and the outside edge line of the vehicle in front of the leading vehicle (parallel to the vehicle's median longitudinal plane) in the adjacent lane
- dx0_f = Distance in longitudinal direction between the front end of the leading vehicle and the rear end of the vehicle in front of the leading vehicle
- dfy = Width of the vehicle in front of the leading vehicle
- doy = Width of the leading vehicle
- dox = Length of the leading vehicle
Vehicle motion:
- Vy = Leading vehicle lateral velocity (lateral motion)
- Gx_max = Maximum deceleration of the leading vehicle in G
- dG/dt = Deceleration rate (J
109、erk)of the leading vehicle 17 2.3.3.1.4.Calculation of Boundary As discussed above,the specific standard value can be derived by the numerical calculation of the competent and careful human driver model.The parameter region for the standard value derivations are set to allow combinations of every pa
110、rameter within the maximum vehicle velocity region allowed by the ADS to be targeted.2.3.3.1.4.1.Derivation result of the preventable boundary of cut-in scenario The safety standard of the cut-in is derived for every relative velocity between the ego vehicle and the counter vehicle.Collision with th
111、e cut-in vehicle is not allowed in the parameter region indicated by the green area in Figure 26.Figure 25.Conceptual diagram of cut-in scenario parameters Figure 26.Preventable boundary data sheet of cut-in scenario 18 2.3.3.1.4.2.Derivation result of cut-out scenario standard The cut-out safety st
andard requires that collisions with all decelerating (or stopped) vehicles located ahead of the cut-out vehicle must be avoidable. This standard is derived by making the aforementioned competent and careful human driver model follow the leading vehicle at THW = 2.0 s. This value, i.e., THW = 2.0 s, is applied by
113、 referring to the laws and instructions of each country.Figure 27.Conceptual diagram of cut-out scenario parameters Figure 28.Preventable boundary data sheet of cut-out scenario 19 2.3.3.1.4.3.Derivation result of preventable boundary of deceleration scenario The safety standards for deceleration sc
114、enarios are required to enable avoidance of collision with the suddenly decelerating vehicle at 1.0 G or less or by stopping the vehicle.This standard is derived by making the aforementioned competent and careful human driver model follow the leading vehicle at THW=2.0 s.This value,THW=2.0 s,is appl
ied by referring to the laws and instructions of each country.
Figure 29. Conceptual diagram of decelerating scenario parameters
Figure 30. Preventable boundary data sheet of decelerating scenario
NOTE: A preventable boundary does not show up at 60 km/h or less because the braking force is sufficient.
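As a rough illustration of how such a preventable boundary can be derived numerically, the sketch below makes the driver model follow the lead vehicle at THW = 2.0 s, lets the lead vehicle decelerate at a constant rate, applies a 0.4 s risk perception delay and the braking ramp defined earlier, and checks whether the gap ever reaches zero. It is a simplified reconstruction under stated assumptions, not the calculation tool used to produce the data sheets.

```python
# Simplified numerical sketch (assumptions stated in the comments, not the
# official JAMA derivation tool) of the deceleration-scenario preventable
# boundary: ego follows the lead vehicle at THW = 2.0 s; the lead vehicle
# suddenly decelerates; the driver model starts braking 0.4 s after the
# deceleration begins and ramps to 0.774 G within 0.6 s.

G = 9.81
THW_S = 2.0
RISK_PERCEPTION_S = 0.4
MAX_DECEL_G = 0.774
RAMP_TIME_S = 0.6

def is_preventable(v0_kmh: float, lead_decel_g: float, dt: float = 0.001) -> bool:
    """Forward-simulate the ego driver model against a suddenly braking lead vehicle."""
    v0 = v0_kmh / 3.6
    gap = THW_S * v0                      # following distance at THW = 2.0 s
    v_ego = v_lead = v0
    t = 0.0
    while gap > 0.0 and (v_ego > 0.0 or v_lead > 0.0):
        # Lead vehicle applies a step deceleration from t = 0 (simplification).
        v_lead = max(v_lead - lead_decel_g * G * dt, 0.0)
        # Ego driver model brakes after the risk-perception delay, with a linear ramp.
        t_brake = t - RISK_PERCEPTION_S
        a_ego = min(max(t_brake, 0.0) / RAMP_TIME_S, 1.0) * MAX_DECEL_G * G
        v_ego = max(v_ego - a_ego * dt, 0.0)
        gap += (v_lead - v_ego) * dt
        t += dt
    return gap > 0.0   # True if both vehicles come to rest without the gap closing

if __name__ == "__main__":
    for v in (60, 80, 100, 120):
        verdict = "preventable" if is_preventable(v, 1.0) else "collision"
        print(f"{v} km/h, lead decelerating at 1.0 G: {verdict}")
```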
2.3.4. Safety evaluation method for perception disturbance
The basic conception of the safety standard is as follows: collisions must be avoided in any of the traffic disturbance scenarios, even when experiencing perception disturbances. When considering that lane deviation can also contribute to collisions, the per
ception of objects on the roadway, as well as of lane markings, is necessary to avoid collisions (Figure 31). Moreover, there are two types of phenomena that result from a perception disturbance: a false negative, where existing objects are not correctly detected, and a false positive, where objects that do not exist are falsely detected (Figure 32).
Figure 31. Types of detection target
Figure 32. Detection result caused by disturbance
When these are combined, evaluations based on the concept of safety standards become necessary for four categories of situations in total (Figure 33).
Figure 33. Four categories of detection disturbance situation
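For illustration, the four categories can be written as a small classification helper: ground truth (target present or absent) combined with the detection result yields false negatives and false positives for both objects and lane markings. The names and structure below are invented for the example and are not part of the framework.

```python
# Illustrative sketch (invented names, not from the JAMA document) of the four
# perception-disturbance evaluation categories:
# {false negative, false positive} x {object on the roadway, lane marking}.
from dataclasses import dataclass
from enum import Enum

class TargetKind(Enum):
    OBJECT = "object on the roadway"
    LANE = "lane marking"

@dataclass
class DetectionOutcome:
    kind: TargetKind
    exists: bool      # ground truth: the target is really there
    detected: bool    # perception output under the disturbance

def categorize(outcome: DetectionOutcome) -> str:
    if outcome.exists and not outcome.detected:
        return f"false negative ({outcome.kind.value})"   # missed target
    if not outcome.exists and outcome.detected:
        return f"false positive ({outcome.kind.value})"   # ghost target
    return "correct detection"

print(categorize(DetectionOutcome(TargetKind.OBJECT, exists=True, detected=False)))
print(categorize(DetectionOutcome(TargetKind.LANE, exists=False, detected=True)))
```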
The following is considered within the ODD region as the parameter region of perception disturbance, in order to define an appropriate region for each disturbance factor: (1) regions defined by road structure, the Road Traffic Law and other laws and regulations (e.g. when visibility is 50 m or less the road is closed; a level difference of 15 cm on the road surface must be repaired); (2) regions that are determined to be possible at a certain probability based on statistical data (e.g. precipitation, brightness, and sun altitude). M
121、oreover,this safety standard is not the performance standard allocated to an individual sensor.Instead,it should complement the entire recognition system installed.The above flow of safety perception can be summarized as follows.22 Figure 34.Safety assessment of perception disturbance detection flow
122、 2.3.5.Safety evaluation method for vehicle disturbance A vehicle disturbance indicates sudden disturbances(e.g.puddles or sudden gust of wind).Although these are unpredictable phenomena,drivers can safely drive by following common sense related to road design,road maintenance/management and road en
123、vironmental conditions.Thus,the premise of driving on common roads is that the roads are constructed by responsible public or private organisations which follow basic principles such as legality,ethics and engineering and are always maintained and managed.Most countries have road structure ordinance
124、s and guidelines for road maintenance and repair to ensure that the road geometry design enables safe driving by every person with a valid driving license(regardless of their driving skill,reflexes,or age).Moreover,when there is a risky situation,such as freezing or a sinkhole,that can hinder drivin
125、g,the road administrator is obliged to warn the drivers in advance,e.g.,with a traffic sign.Based on these preconditions,a technical safety approach for foreseeable vehicle disturbances is introduced.As shown in Figure 6,collisions must be avoided in any of the traffic disturbance scenarios,even whe
126、n experiencing vehicle disturbance.In the current standards,the collision avoidance strategy under the foreseeable and avoidable scenarios and collision mitigation strategies for predictable but unavoidable scenarios are of particular consideration.Henceforth,when a vehicle behaviour changes because
 of a vehicle disturbance within the scope of avoidable conditions, the AD vehicle is required to possess a controllability that can stabilise the vehicle without halting driving. However, when these disturbances cause instability that cannot be avoided, the AD vehicle must adopt a best-effort stra
128、tegy to mitigate the possible collision.Figure 35 shows a specific example of the safety approach for foreseeable vehicle disturbances.The upper section of the figure represents an example of the AD vehicle experiencing a rapid decrease of sliding friction while staying within the avoidable conditio
129、ns on a wet road;in such a state,the vehicle must be able to be safely controlled without interrupting the driving process.However,the lower section of the figure represents an example involving an AD vehicle equipped with summer tires encountering a frozen road,which causes a rapid decrease of slid
130、ing friction and generates a vehicle state that was defined to be unavoidable in advance(e.g.,maximum deceleration).Therefore,the safety approach toward vehicle disturbances is based on the principle and clear definitions of vehicle motion engineering related to the definitions of the states where t
131、he vehicle is controllable and the states where the vehicle is uncontrollable.(Section 4.3.3 for detail).23 Figure 35.Safety approach for avoidable(above)and unavoidable(below)vehicle disturbance When these considerations are combined with traffic disturbances,the safety of the AD vehicle does not a
132、ffect the test result if the stability of the vehicle is maintained.Moreover,while wind affects other vehicles,it only influences the lateral velocity as with cut-in,and it is included in the original traffic flow parameters.The safety standards for vehicle motion disturbances are evaluated relative
ly, without including the vehicle disturbance in the traffic flow scenario. Therefore, the safety standards for vehicle disturbances only need to set the most strict condition under the premise that the Road Traffic Act is strictly adhered to. Drivers are responsible for the maintenance of their vehicles, the road administrator is appointed as per the Road Traffic Act, roads are managed and operated according to the Road Structure Ordinance and the guidelines for road maintenance and repair, and the performance criterion is that the vehicle does not depart from the road surface. As an example, the disturbance factors and conditions for motorways in Japan (refer to 4.3.3.8 for general roads) are listed below:
- Road surface state: Friction coefficient is 0.3 (locked wheel) or more; external force on the tyres is at or below the set point of the road maintenance and repair guidelines (e.g. rut: 25 mm, level difference: 30 mm, pothole: 20 cm)
- Road geometry: Curve within the regulation of the Road Structure Ordinance, i.e., R = 460 m at a vehicle velocity of 100 km/h
- Natural phenomena: Lateral wind speed without speed control is 10 m/s at a vehicle velocity of 100 km/h
As the most difficult condition here is when the above-mentioned disturbances all simultaneously occur, these three factors are added up for evaluation (Figure 36).
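A rough friction-budget estimate of this combined worst case (100 km/h on an R = 460 m curve, 10 m/s lateral wind, friction coefficient 0.3) is sketched below. The vehicle mass, side area and aerodynamic coefficient are invented example values and the steady-state model is a simplification; this is not the evaluation method defined by the framework.

```python
# Rough friction-budget estimate (a sketch with invented vehicle parameters,
# not the JAMA evaluation method) for the combined worst case discussed above:
# R = 460 m curve, 100 km/h, lateral wind 10 m/s, road friction coefficient 0.3.

MU = 0.3                 # road surface friction coefficient
G = 9.81
RADIUS_M = 460.0
SPEED_MPS = 100.0 / 3.6
WIND_SPEED_MPS = 10.0

# Invented example vehicle parameters (assumptions for illustration only).
MASS_KG = 1800.0
SIDE_AREA_M2 = 4.0
AIR_DENSITY = 1.25
CD_SIDE = 1.0            # crude side-force coefficient

def lateral_acceleration_budget() -> tuple[float, float]:
    """Return (required lateral acceleration, available friction acceleration)."""
    a_curve = SPEED_MPS ** 2 / RADIUS_M                                   # cornering demand
    wind_force = 0.5 * AIR_DENSITY * CD_SIDE * SIDE_AREA_M2 * WIND_SPEED_MPS ** 2
    a_wind = wind_force / MASS_KG                                         # crosswind demand
    return a_curve + a_wind, MU * G

required, available = lateral_acceleration_budget()
print(f"required lateral acceleration : {required:.2f} m/s^2")
print(f"available friction (mu * g)   : {available:.2f} m/s^2")
print("lane keeping feasible:", required < available)
```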
138、 wind is 5 m/s or more,i.e.,driving is not possible)must be defined in advance as ODD by the manufacturer.Furthermore,as a functional requirement,the slow puncture that occurs while driving should be managed before the vehicle becomes uncontrollable(before the rim touches the surface of the road).Th
139、e summary of the flow of safety perception discussed to date is listed below.Figure 37 Safety perception flow of vehicle motion disturbance 25 3.Scenario-Based Safety Assurance Process Figure 38 shows the schematics for the overall safety argumentation system in development and production cycle base
d on the V-shaped model, which is the project management approach commonly applied to the development of advanced driving assistance systems (ADAS) and AD systems. By integrating verification into the sensor setup assessment and software agility basement processes from the planning phase in the first half of dev
141、elopment,rather than conducting it only during the latter half of development represented by the right side of the V-shape,it can contribute to the optimisation of the development.Figure 38.Overall scheme of safety assurance process 3.1.Safety argumentation scheme(Steps of the V-shaped model)3.1.1.I
142、tem definition The safety argumentation process is for making the vehicle compatible with the safety target within the operation scope of the automatic driving vehicle that was determined in advance.The operation scope of automatic driving vehicles is defined at the initial stage as the operation de
143、sign scope(ODD).The contents of the ODD must include,at a minimum,information such as the road type,position on the road,vehicle velocity scope and environmental condition.Moreover,a fallback strategy for transition to outside the ODD boundary must be designed;moreover,the AD system must detect whet
her it is operating within the defined ODD. The definition of the ODD must be structured in such a manner as to enable notification to the users, as well as allow them to understand, trust and operate the AD system (Khastgir, Birrell, Dhadyalla, & Jennings, 2018). Note that by mapping the ODD system and the scenar
io system as shown in Figure 39, it becomes possible to select the evaluation scenario following the ODD range.
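A minimal sketch of such an ODD-to-scenario mapping is shown below: an ODD description with a few attributes is used to filter a scenario list down to the cases applicable within the ODD range. The attribute and class names are invented for the example and do not reflect the actual database schema.

```python
# Minimal sketch (invented attribute names, not the framework's data model)
# of mapping an ODD definition onto a scenario set so that only scenarios
# applicable within the ODD range are selected for evaluation.
from dataclasses import dataclass, field

@dataclass
class ODD:
    road_types: set[str]            # e.g. {"motorway"}
    max_speed_kmh: float            # upper bound of the operating speed range
    conditions: set[str] = field(default_factory=lambda: {"daylight", "rain"})

@dataclass
class Scenario:
    name: str
    road_type: str
    min_speed_kmh: float
    required_condition: str | None = None

def select_scenarios(odd: ODD, scenarios: list[Scenario]) -> list[Scenario]:
    """Keep only scenarios whose preconditions fall inside the declared ODD."""
    selected = []
    for s in scenarios:
        if s.road_type not in odd.road_types:
            continue
        if s.min_speed_kmh > odd.max_speed_kmh:
            continue
        if s.required_condition and s.required_condition not in odd.conditions:
            continue
        selected.append(s)
    return selected

odd = ODD(road_types={"motorway"}, max_speed_kmh=100.0)
db = [Scenario("cut-in, straight road", "motorway", 60.0),
      Scenario("crossing at intersection", "general road", 30.0),
      Scenario("cut-out in snowfall", "motorway", 60.0, required_condition="snow")]
print([s.name for s in select_scenarios(odd, db)])  # -> ['cut-in, straight road']
```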
Figure 39. ODD scenario classification and relationship diagram of the system-level classification based on the three-category scenario level
3.1.2. Safety Analysis
It is important to determine as many foreseeable scenarios as possible, as well as systematise detailed scenario-related information on the operation
148、 design scope(ODD),vehicle and its surrounding,technically comprehensive definition of ODD based on the system physics,in addition to the overall definition of ODD that employs the systematic combination approach.For instance,the word rain is enough for communicating with the user if rainfall condit
149、ions are included in the ODD;however,the AD system itself cannot interpret such a concept in the same manner.This scenario is able to consider the influence of rain from the perspective of system physics instead such as the possibility of the influence of raindrops on the sensor performance or the i
150、nfluence of rain on the vehicle dynamics(e.g.,decrease in friction coefficient between the tire and the wet road surface).To describe ODD in a technical and system-oriented way,it is classified into three categories related to the system physics in order.These categories cover the respective percept
151、ion,traffic flow and vehicle disturbances that can potentially occur within the AD system safety analysis(Figure 2).3.1.3.Safety Design and Safety Concept The system requirements should be produced based on the safety analysis steps.The safety target defined by our association is integrated into the
152、 development cycle during this process,as well as confirmed during the system design.As layers of different complexity are added to the safety design,the safety analysis cycle can be unified as per necessity between this process and the preceding process as long as their outputs follow the safety an
153、alysis steps.It is important to ensure compatibility between the ODD and the system requirements to avoid unnecessary specification changes in the system development process.This indicates the importance of the role of the safety analysis step.3.1.4.System development When the system design is compl
154、ete and its safety is analysed,the actual system that includes the component elements of both software and hardware is developed.27 3.1.5.Examination and validation of the sub-system and the vehicle At this point,the strategy for safety examination and validation of the system and the vehicle is def
ined without interaction with the driver. The examination and validation are conducted by combining concentrated virtual evaluations and a relatively limited amount of physical tests in real traffic environments and at test courses. The mathematical and physical accuracy of the system, development functi
156、ons,and employed safety measures are verified in the sub-process of the examination.Moreover,verification is performed in regard to whether all the safety specifications and requirements drawn up during the safety analysis process(sufficiency of sensors,algorithm and actuator-related measures)have b
een satisfied. For the validation sub-process, verification is performed in terms of whether the system and components, including the employed safety measures, pose an unreasonable risk to the traffic participants. Moreover, the safety of the AD system is substantiated by confirming that the defined validat
158、ion targets were met.3.1.6.Safety assessment The test for determining whether the end product is acceptable is conducted during this step,which includes the related inspections,document checks and certifications.3.1.7.Final check process before release In the final check before release,verification
159、is performed in terms of whether the safety of the AD system can be explained,in addition to whether the remaining risk is within the permissible range.This can be conducted by,e.g.,using technologies such as the behaviour safety assessment(BSA),which focuses on the evaluation of the AD system at ea
160、ch test case by applying different measurement standards and confirms the compatibility of AD with predefined behaviour standards.Finally,a determination is made in terms of whether the system can be released during the review of the result,and then the post-release incident management strategy is d
161、esigned.3.1.8.Incident management During the incident management process,the performance data is fed back into the safety argumentation process.This enables the improvement of the AD technology and reduces the number of unforeseeable situations as time passes.It is expected that,because of this redu
ction, the boundary between the two left quadrants will shift and the unforeseeable region will become smaller, in a way that is beneficial to the foreseeable scenarios (Figure 40). Following the same logic, it is expected that the boundary between the preventable and unpreventable scenarios shifts rightward, and the quadrant on the upper left will expand. It is highly possible that this will occur as more scenarios become preventable.
Figure 40. Expansion of foreseeable and preventable scopes following the evolution of the AD system
4. Scenario structure
Every approach is constructed by applying the systematic combination approach for defining the combinations derived
166、 from all possible factors.This approach requires significant specialized effort for defining all the factors and their interdependency as was the case by examining the safety coverage target.Therefore,it requires a systematic standardization methodology for structuring every factor related to the i
167、nformation.As mentioned earlier,the structures of the scenarios are the possible disturbances that can occur in three different categories related to the physics of the system,namely,the perception disturbance,traffic disturbance and vehicle motion disturbance.4.1.Traffic disturbance scenario Traffi
168、c disturbance scenarios are classified as general vehicle scenarios(including automobile and motorcycles),motorcycle-specific scenarios,and vulnerable road user scenarios(Figure 41).These three scenario classifications are further generated by systematically analyzing and classifying the combination
169、s of different factors,namely the road geometry,ego-vehicle behavior,and the locations and motions of the surrounding traffic participants(Figure 42).Figure 41.Traffic disturbance scenario classification Figure 42.Structure of a traffic disturbance scenario NOTE:The vulnerable road user scenario wil
170、l be included in the next version.4.1.1.General vehicle scenario For traffic disturbance scenarios involving general vehicles,we provide specific explanations for the road geometry,ego-vehicle behavior,and the locations and motions of the surrounding traffic participants.4.1.1.1.Road geometry catego
171、ry The standard road is a non-intersection road (a).Merge zones(b)are formed when another road merges into a single road.When a single road splits,a branch zone(c)is formed.Furthermore,when one straight road intersects another straight road,an intersection(d)is formed(Figure 43).These roads are comb
172、ined to form various types of roads.Motorways are classified into three categories:main roads(non-intersection),merge zones,and branch zones,with intersections being excluded.The road scenario classification for scenario generation must be also discussed to make it applicable to highways internation
173、ally(Association,2004)(Transportation,2008;UK,2006).29 NOTE:Another type of road shape is a roundabout.For this,we must either consider a combination of merging and branching roads or prepare a separate scenario.In addition,we intend to include another Annex that considers parking lots and trams,amo
174、ng other scenarios.Figure 43.Road geometry classifications 4.1.1.2.Vehicle behavior category Vehicles move in a straight line along the lanes of road geometry(a)(also known as lane keeping).In addition,vehicles move between lanes from an adjacent and merging lane(b)(lane change).Here,while a lane ch
175、ange from an adjacent lane and a merging lane have different road geometry categories,as vehicle behaviors,both are considered to be lane changes.At intersections,the vehicle turns without changing lanes(right or left turn).Therefore,the possible vehicle behaviors are classified into three categorie
176、s:going straight,lane change,and turning.This vehicle behavior category is expressed using a combination of the road geometry information discussed above(Figure 44).NOTE:In addition to right and left turns,there is also the U-turn as a turning behavior,but the ADS will not perform a typical U-turn;h
owever, if a road is designed for U-turns, it is treated as a merge zone.
Figure 44. Parameters of road geometry and vehicle behavior
4.1.1.3. Categories of positions and motions of surrounding vehicles
The neighboring positions in six directions around the ego vehicle that have a possibility of entering the driving trajectory of the ego vehicle, the left and right when entering from an intersection, and three oncoming directions, for a total of eleven directions, define the surrounding vehicles' pos
180、itions that must be considered in a scenario structure.Moreover,if the speed difference between the leading vehicle and the vehicle in front of it is significant,the leading vehicle may perform a lane change(cut-out*1:Figure 46)to avoid a collision.If there is a sudden lane change,the ego vehicle ma
181、y need to take action to avoid a collision.To account for such a scenario,the position of the vehicle in front of the leading vehicle is considered and indicated as“+1”(Figure 45).An oncoming vehicle may also enter the lane of the ego vehicle by performing a lane change(mark under cut-in*2:Figure 46
182、).Figure 45.Positions of surrounding vehicles 31 Figure 46.The combination of the surrounding vehicle positions and the motions that can potentially obstruct the ego vehicle NOTE:In Ver 2.0,we placed other vehicles next to the ego vehicle;however,it has been eliminated.The reason for this is that th
183、e positions next to the ego vehicles would be covered depending on the initial positions of the vehicles in the front and rear(e.g.,positions 3 and 4),.The behaviors of the surrounding vehicles are classified into three categories:going straight(acceleration/deceleration),lane change(cut-in/cut-out)
184、and swerving(e.g.,behavior to avoid a stopped vehicle),and turning(right and left turn,U-turn).From a safety evaluation perspective,it is possible to minimize the number of evaluations by focusing on the behaviors of other traffic participants that have the potential to obstruct the behavior of the
185、ego vehicle(Figure 46).For instance,the turning of the vehicle in position 2 does not interfere with the ego vehicle;thus,it can be excluded from the safety analysis.The check mark in the figure indicates cases where the corresponding combinations of the surrounding vehicle positions and motions can
186、 potentially impact the driving of the ego vehicle,which must be considered in the safety analysis.4.1.1.4.Resulting traffic disturbance scenarios As a result of the systematization process discussed thus far,a methodology for structuring scenarios as a combination of the road geometry,the behavior
of the ego vehicle, and the position and motion of the surrounding vehicles is proposed herein. This structure consists of a matrix that contains 58 possible combinations in total (Figure 47). When limited to motorways as an example, there are three categories for the road geometry ("straight roads", "merging zones" and "branching zones"), two categories for the ego-vehicle behavior ("going straight" and "lane change"), and two categories each (four in total) for the positions and motions of the surrounding vehicles ("going straight (acceleration/deceleration)" and "lane change (cut-in/cut-out)"). The motorway scenarios consist of a matrix with 24 possible combinations that could occur in a real traffic flow (Figure 48). Based on the similar accident categories, the sufficiency of these 58 cases, which cover all the dangerous cases that can lead to an accident, can be evaluated (Annex D).
190、sive coverage of traffic disturbances resulting from interactions between two vehicles.The scenarios described here as traffic disturbance scenarios(Figures 47 and 48)are representative and must be able to consider a combination of the surrounding vehicle positions and behaviors that could obstruct
191、the ego vehicle(Figure 46).For example,Figure 49 presents the results of a scenario developed for Figure 48(Nos.5,32 6,7,and 8)where the road geometry consists of a single road,the ego vehicle behavior involves a lane change,and the motion of surrounding vehicles involves going straight and performi
192、ng a lane change.To elaborate on No.5,when the ego vehicle makes a lane change,cases where the surrounding vehicle is in front,situations in which the surrounding vehicle is in the front,the rear,or the side(i.e.,the vehicle in the front or rear is beside the ego vehicle)must be considered.The route
193、s that could lead to obstructions will differ when the number of lanes is different,even if the positions of the nearby vehicles remain the same.As a result,it is important to consider the positions of the surrounding vehicles and the number of lanes,as well as identify combinations of behaviors tha
Figure 47. Traffic disturbance scenarios for general vehicles (matrix of road sector (non-intersection, merge zone, branch zone, intersection) and subject-vehicle behaviour (going straight, lane change/swerving, turning) against the location and behaviour of surrounding traffic participants; scenarios No. 1 to No. 58)
Figure 48. Traffic disturbance scenarios for general vehicles on motorways
Figure 49. Scenarios with various combinations of positions of the surrounding vehicles and behaviors that could obstruct the ego vehicle
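As a cross-check of the counts quoted above, the motorway sub-matrix can be enumerated mechanically. The Python sketch below is a non-normative illustration that reproduces the 3 x 2 x 4 = 24 combinations; its numbering does not correspond to the scenario numbers of Figure 48.

```python
# Enumerate the motorway traffic disturbance matrix described in 4.1.1.4:
# 3 road geometries x 2 ego behaviours x 4 surrounding-vehicle motions = 24.
from itertools import product

ROAD_GEOMETRY = ["straight road", "merging zone", "branching zone"]
EGO_BEHAVIOUR = ["going straight", "lane change"]
SURROUNDING_MOTION = [
    "going straight (acceleration)",
    "going straight (deceleration)",
    "lane change (cut-in)",
    "lane change (cut-out)",
]

matrix = list(product(ROAD_GEOMETRY, EGO_BEHAVIOUR, SURROUNDING_MOTION))
assert len(matrix) == 24  # matches the count quoted for Figure 48

for number, (road, ego, other) in enumerate(matrix, start=1):
    print(f"No.{number:2d}: {road:14s} | ego: {ego:14s} | surrounding: {other}")
```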
4.1.2. Scenarios unique to motorcycles
In general, the aforementioned categories of positions and motions of surrounding vehicles (Figure 44) are applied to both four-wheeled vehicles and motorcycles. However, there are situations where motorcycles may drive in the narrow space within the same lane as the ego vehicle, which requires additional safety evaluation scenarios. Because these scenarios can only occur in countries where such driving is legally allowed, an approach including detailed examples is shown in Annex B.
4.1.3. Scenarios resulting from the combination of behaviours by several vehicles
The proposed traffic disturbance scenario structure covers the relationship between the ego vehicle and one or two surrounding vehicles. However, in real traffic, multiple traffic participants take diverse actions at various moments. The current methodology covers these complex cases by extracting scenarios in which sudden motions by surrounding vehicles trigger a sequence of avoidance motions. By dividing these scenario types into a sequence of behaviours, multiple combinations of the positions and motions of the ego vehicle and the surrounding vehicles can be covered by the safety analysis. Moreover, this can be realized by considering the influence of the road environment on cut-in scenarios by other vehicles that can potentially appear in this sequence. For instance, when the leading vehicle performs a sudden deceleration (the first behaviour of the sequence), the avoidance motion by the ego vehicle occurs (the second behaviour) and the ego vehicle retreats into the surrounding avoidance area. The details of the approach to these complex scenarios, including detailed examples, are given in Annex C.
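Such a sequence-based decomposition can be represented as an ordered list of single behaviours, each of which maps onto one two-vehicle traffic disturbance scenario. The Python sketch below only illustrates this data structure; the concrete sequences and their triggers are the ones developed in Annex C.

```python
# Minimal sketch: a multi-vehicle case expressed as a sequence of single
# behaviours, each triggered by an earlier step (cf. 4.1.3). Illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BehaviourStep:
    actor: str                           # "ego" or an identifier of a surrounding vehicle
    behaviour: str                       # e.g. "sudden deceleration", "avoidance lane change"
    triggered_by: Optional[int] = None   # index of the step that triggers this one

sequence = [
    BehaviourStep("leading vehicle", "sudden deceleration"),
    BehaviourStep("ego", "avoidance lane change", triggered_by=0),
    BehaviourStep("adjacent vehicle", "cut-in into the avoidance area", triggered_by=1),
]

# Each step can be analysed as one two-vehicle scenario of the Figure 47 matrix.
for index, step in enumerate(sequence):
    print(index, step.actor, "->", step.behaviour)
```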
4.2. Perception disturbance scenarios
Perception disturbance scenarios include blind spot scenarios and connectivity disturbance scenarios, in addition to perception disturbances (Figure 50).
Figure 50. Categories of perception disturbance scenarios
4.2.1. Perception disturbance scenarios
Perception disturbance refers to a negative effect on perception performance in a situation in which the automated driving system detects objects. A perception disturbance scenario is generated from disturbance-triggering factors and is based on the principle of the sensor where the disturbance occurs. While the factors of disturbance are diverse, it is possible to select a scenario group that covers perception disturbance as a whole by classifying the factors based on their generation principle and then selecting a representative factor among those in the same category. Moreover, by considering the necessary combinations based on the generation principle of each disturbance factor, it is possible to create a perception disturbance combination evaluation scenario. In this study, the disturbance scenarios are derived for three types of sensors, namely, millimetre-wave radar, LiDAR and camera (Figure 51).
Figure 51. Scenario derivation process based on perception disturbance factors and sensor principle
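The classify-then-select logic described above can be expressed compactly: group the disturbance factors by their generation principle and keep one representative per group. The Python sketch below is a simplified illustration; the factor names and principle labels are placeholders, not entries taken from the tables of this section.

```python
# Group perception disturbance factors by generation principle and keep one
# representative factor per principle (cf. 4.2.1). Names are illustrative.
from collections import defaultdict

factors = [
    ("heavy rain",       "attenuation of the signal in space"),
    ("dense fog",        "attenuation of the signal in space"),
    ("wet road surface", "unwanted reflection towards the sensor"),
    ("tunnel wall",      "unwanted reflection towards the sensor"),
]

groups = defaultdict(list)
for name, principle in factors:
    groups[principle].append(name)

# One representative per generation principle is evaluated for the whole group.
representative_scenarios = {principle: names[0] for principle, names in groups.items()}
print(representative_scenarios)
```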
4.2.1.1. Perception disturbance factors
The factors of perception disturbance can be broadly classified into "vehicle/sensor", "surrounding environment" and "perception target" in relation to the ego vehicle; these are then broken down and comprehensively classified at each layer to compose the system of perception disturbance factors. Here, for example, a factor is broken down from the perspectives of structure, relative position and type, and continues to be categorized into layers such as colour, shape, material and behaviour.
Figure 52. Broad categories of perception disturbance factors according to the positional relationship with the ego vehicle
Figure 53. System diagram of perception disturbance factors
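The layered factor system of Figures 52 and 53 can be pictured as a nested structure: three broad categories defined by the positional relationship to the ego vehicle, each broken down into sub-categories and then into concrete factors. The Python fragment below sketches that shape; the sub-category labels follow those introduced in 4.2.1.1.1 to 4.2.1.1.3, while the few leaves shown are illustrative examples only, and the complete breakdown is the one given in the tables of this section.

```python
# Sketch of the layered perception disturbance factor system (Figures 52-53).
# Only a handful of illustrative leaves are shown; see the detailed tables for
# the actual breakdown.
PERCEPTION_DISTURBANCE_FACTORS = {
    "vehicle/sensor": {
        "a. ego vehicle": ["change of vehicle posture"],
        "b. sensor": ["misalignment", "failure of the sensor itself"],
        "c. in front of the sensor": ["sticking objects", "surface damage"],
    },
    "surrounding environment": {
        "d. surrounding structure": {
            "d-1. road surface": [],
            "d-2. structure by the road": [],
            "d-3. structure above the road": [],
        },
        "e. space": ["rain", "snow", "fog"],
        "f. surrounding moving objects": [],
    },
    "perception target": {
        "g. route": [],
        "h. traffic information": [],
        "j. obstacles": [],
        "k. moving objects": [],
    },
}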
4.2.1.1.1. Perception Disturbance Factors: Vehicle/Sensor
The perception factors classified into "vehicle/sensor" are divided into three categories according to the positions of these factors, namely, "a. ego vehicle", "b. sensor" and "c. in front of the sensor".
Figure 54. Vehicle/sensor categories
Tables 3-5 show the details of the perception disturbance factors categorized into a, b and c. These tables describe the detailed categorization, the impact on perception performance, and the generation principle of perception disturbance for each factor and each sensor.
Table 3. "a. Ego Vehicle" disturbance factors
Table 4. "b. Sensor" disturbance factors
Table 5. "c. In front of sensor" disturbance factors
4.2.1.1.2. Perception disturbance factors: Surrounding environment
The perception factors classified into "surrounding environment" are divided into three categories according to the character of the objects existing around the ego vehicle, namely, "d. surrounding structure", "e. space" and "f. surrounding moving objects". "d. Surrounding structure" is further divided into the following three categories: "d-1. road surface", "d-2. structure by the road" and "d-3. structure above the road".
Figure 55. Surrounding environment categories
Tables 6-10 show the detailed categorization, the impact on perception performance, and the generation principle of the perception disturbance factors classified into d-1, d-2, d-3, e and f.
Table 6. "d-1. Road surface" disturbance factors
Table 7. "d-2. Structures by the road" disturbance factors
Table 8. "d-3. Structures above the road" disturbance factors
Table 9. "e. Space" disturbance factors
Table 10. "f. Surrounding moving objects" disturbance factors
4.2.1.1.3. Perception Disturbance Factors: Perception Targets of Sensors
The perception disturbance factors categorized as "perception targets of sensors" are broadly classified into "g. route", "h. traffic information", "j. obstacles" and "k. moving objects" (Figure 56).
Figure 56. Categories of perception targets of sensors
"g. Route" is classified into "g-1. lane marker", "g-2. structure with height" and road edge, according to the object that indicates that a given place is a driving route. Moreover, the road edge is further divided into g-3 and g-4 depending on whether there is a level difference or not (Figure 57).
Figure 57. Categories of "g. route"
"h. Traffic information" is classified into "h-1. traffic light", "h-2. traffic sign" and "h-3. road marking" according to their display style (Figure 58).
Figure 58. Categories of "h. traffic information"
"j. Obstacles" are classified into "j-1. falling object", "j-2. animal" and "j-3. installed object" according to whether they move or not and the degree of impact when colliding with the vehicle (Figure 59).
Figure 59. Categories of "j. obstacle"
"k. Moving objects" are classified into "k-1. other vehicles", "k-2. motorcycle", "k-3. bicycle" and "k-4. pedestrian" according to the type of traffic participant (Figure 60).
Figure 60. Categories of "k. moving objects"
Tables 11-24 show the detailed categorization, the impact on perception performance, and the generation principle of perception disturbance for the disturbance elements classified into g-1 to k-4, respectively.
Table 11. "g-1. Lane marker" disturbance elements
Table 12. "g-2. Structure (with height)" disturbance elements
Table 13. "g-3. Road edge without level difference" disturbance elements
Table 14. "g-4. Road edge with a step" disturbance elements
Table 15. "h-1. Traffic lights" disturbance elements
Table 16. "h-2. Traffic sign" disturbance elements
Table 17. "h-3. Road marking" disturbance elements
Table 18. "j-1. Falling object" disturbance elements
Table 19. "j-2. Animal" disturbance elements
Table 20. "j-3. Installation object" disturbance elements
Table 21. "k-1. Other vehicles" disturbance elements
Table 22. "k-2. Motorcycle" disturbance elements
Table 23. "k-3. Bicycle" disturbance elements
Table 24. "k-4. Pedestrian" disturbance elements
4.2.1.2. Generation principle of sensor perception disturbance
The sensor can potentially experience perception disturbance when detecting objects because of the factors discussed in the preceding section. While the principle of perception disturbance generation is different for each sensor, the principles can be categorized according to the following common perspectives. The sensor disturbance principles are classified into "those occurring due to perception processing", "those occurring due to cognitive processing" and "others". The disturbances occurring because of perception processing are classified into those related to the signal from the perception target (S) and those that hinder the signals from the perception target (noise N, unnecessary signal U). The disturbances that can occur are then listed individually for S, N and U. Examples of the categories of generation principles of perception disturbances that could occur for each sensor, based on these perspectives, are as follows.
Generation principle of perception disturbance of millimetre-wave radar
The perception disturbances that occur in millimetre-wave radar include those caused by the direction of the sensor, those occurring because of perception processing and those occurring because of cognitive processing (Figure 61).
Figure 61. Categories of perception disturbances for millimetre-wave radar
In particular, the physical quantities that characterize the signal S in the perception processing of millimetre-wave radar are the following three: frequency, phase and strength (Figure 62).
- Frequency: A problem with the signal frequency can be cited as a disturbance originating from the sensor itself.
- Phase: There are cases where the direction from which the signal arrives changes and cases where the amount of propagation delay changes; the changes in signal arrival direction are attributed to reflection and refraction.
- Signal strength: The conceivable situations include partial signal loss, a signal that is too strong, a large difference in signal strengths, and a signal that is too weak.
Furthermore, possible disturbances in regard to the noise N and the unnecessary signal U in perception processing include low S/N, low D/U (the ratio of strength between the necessary signal D and the unnecessary signal U) and an increase of U.
Figure 62. Generation principle of disturbance in millimetre-wave radar perception processing
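The "low S/N" and "low D/U" conditions can be made concrete with a small numerical example. The sketch below is purely illustrative: the received powers and the 10 dB margin are assumed values, not thresholds defined by this framework.

```python
# Illustrative check of the low S/N and low D/U disturbance conditions for
# millimetre-wave radar. Power levels and the threshold are assumed values.
import math

def ratio_db(wanted_power: float, other_power: float) -> float:
    """Power ratio expressed in decibels."""
    return 10.0 * math.log10(wanted_power / other_power)

signal_w = 1.0e-9      # power received from the perception target (S)
noise_w = 5.0e-10      # receiver noise power (N)
undesired_w = 8.0e-10  # power of an unnecessary signal (U), e.g. a stray reflection

snr_db = ratio_db(signal_w, noise_w)      # S/N
du_db = ratio_db(signal_w, undesired_w)   # D/U, with D the necessary signal

MARGIN_DB = 10.0  # assumed detection margin
print(f"S/N = {snr_db:.1f} dB -> {'low S/N' if snr_db < MARGIN_DB else 'sufficient'}")
print(f"D/U = {du_db:.1f} dB -> {'low D/U' if du_db < MARGIN_DB else 'sufficient'}")
```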
Generation principle of LiDAR perception disturbance
The physical quantities that characterize the signal S in the perception processing of LiDAR are the scan timing, strength, propagation direction and velocity.
- Scan timing: The time difference caused by the movement of the ego vehicle leads to positional shifts of the overall space; moreover, the time difference caused by the movement of the perception target leads to a positional shift of that target.
- Strength: Phenomena include saturation, attenuation and shielding.
- Propagation direction change: There are changes caused by reflection and changes caused by refraction.
- Velocity: While it affects the arrival time of signals, there are no corresponding items in the perception disturbance of LiDAR.
Furthermore, the noise N and the unnecessary signal U include reflection and refraction from objects other than the perception target, in addition to DC noise, pulse-like noise and multiple reflections (Figure 63).
Figure 63. Generation principle of disturbance at perception of LiDAR
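The scan-timing item can be illustrated with simple arithmetic: points acquired at the beginning and at the end of one scan are taken while the ego vehicle (or the target) keeps moving, so the scene appears shifted between them. The values below are assumptions chosen only to show the order of magnitude.

```python
# Order-of-magnitude illustration of the LiDAR scan-timing disturbance:
# apparent positional shift accumulated over one scan. Values are assumptions.
scan_period_s = 0.1     # assumed 10 Hz scanning LiDAR
ego_speed_mps = 27.8    # assumed ego speed, roughly 100 km/h
target_relative_speed_mps = 10.0  # assumed relative speed of a moving target

# Shift of the overall space between the first and last point of one scan.
space_shift_m = ego_speed_mps * scan_period_s
# Additional shift of a moving perception target over the same scan.
target_shift_m = target_relative_speed_mps * scan_period_s

print(f"apparent shift of the surrounding space over one scan: {space_shift_m:.2f} m")
print(f"apparent shift of the moving target over one scan: {target_shift_m:.2f} m")
```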
Generation principle of perception disturbance at the camera
The physical quantities that characterize the signal S in the perception processing of the camera are the strength, the direction/range, changes in the signal, and the acquisition time.
- Strength: There are cases where the signal is too weak, the signal is too strong, the difference in signal strength is large, and the signal is partially lost.
- Direction/range: There are changes caused by refraction and changes caused by reflection.
- Changes in the signal S.
- Acquisition time: The possible disturbances caused by blinking of the perception target and by changes in relative positions include flickering and image blur/deletion.
Furthermore, the noise N and the unnecessary signal U include low D/U and low S/N (Figure 64).
Figure 64. Generation principle of disturbance in camera perception
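The flicker item can likewise be quantified with a simple timing argument: if the exposure window is shorter than the OFF phase of a blinking source, some frames can miss the source entirely. The frequency, duty cycle and exposure time below are assumptions used only for illustration.

```python
# Illustrative flicker calculation for camera acquisition timing: fraction of
# randomly timed exposures that fall entirely inside the OFF phase of a
# blinking light source. All values are assumptions.
flicker_hz = 100.0          # assumed blink frequency of the source
duty_cycle = 0.5            # assumed fraction of each cycle the source is ON
exposure_s = 1.0 / 2000.0   # assumed short exposure (bright scene)

period_s = 1.0 / flicker_hz
off_s = period_s * (1.0 - duty_cycle)

# An exposure starting uniformly at random misses the source when the whole
# exposure window fits inside the OFF interval.
miss_fraction = max(0.0, off_s - exposure_s) / period_s
print(f"fraction of frames that can miss the source: {miss_fraction:.2f}")
```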
Scenario selection through cross-checking of perception disturbance elements and generation principles
The relationship between the perception disturbance elements of each sensor and the generation principles can be represented in the matrices shown in Tables 25-28. These matrices list the perception disturbance elements vertically and the generation principles horizontally, which makes it possible to identify the elements (rows) that can potentially give rise to each generation principle (columns). Disturbance elements that appear in the same column are generated by the same principle. From the perspective of a system safety evaluation, it is then possible to select the elements whose degree of influence on the perception performance of each sensor and whose encounter probability in the market are high, and to prioritize them as evaluation scenarios. When several elements have equal priority, one or more of them are selected and evaluated while taking into account the reproducibility of the evaluation environment of that scenario. Moreover, when there are disturbance elements on the vertical axis that do not apply to the given sensor because of the specifications of the ADS under evaluation (such as the ODD and the perception targets), these elements are excluded and the representative scenario is selected from the remaining elements.
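The cross-checking and prioritization step can be sketched as a small selection routine: walk over the element-by-principle matrix, drop elements outside the ODD or the perception targets of the ADS under evaluation, and keep, for each generation principle, the element with the highest combined influence and market encounter probability. The element names, scores and ODD flags below are illustrative placeholders, not values taken from Tables 25-28.

```python
# Illustrative selection of representative evaluation scenarios from an
# element-by-principle matrix (cf. Tables 25-28). All entries are placeholders.
rows = [
    # (disturbance element, generation principle, influence 1-3, encounter prob., in ODD?)
    ("kicked-up spray",        "low S/N (attenuation in space)",    3, 0.30, True),
    ("dense fog",              "low S/N (attenuation in space)",    3, 0.05, True),
    ("wet road reflection",    "low D/U (road surface reflection)", 2, 0.40, True),
    ("snow on sensor surface", "low S/N (attenuation at sensor)",   3, 0.10, False),
]

selected = {}
for element, principle, influence, probability, in_odd in rows:
    if not in_odd:
        continue  # excluded because of the ADS specification (ODD, perception targets)
    priority = influence * probability
    current = selected.get(principle)
    if current is None or priority > current[1]:
        selected[principle] = (element, priority)  # best element per principle column

for principle, (element, priority) in selected.items():
    print(f"{principle}: evaluate with '{element}' (priority {priority:.2f})")
```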
Table 25. Perception disturbance elements and generation principle matrix of millimetre-wave radar (rows: disturbance elements from the vehicle/sensor, surrounding environment and perception target categories; columns: generation principles such as reflection, refraction, change of the direction of arrival, change of propagation delay, low S/N, low D/U and increase of U, together with recognition-processing items; cells graded from small to great impact)
Table 26. Perception disturbance elements and generation principle matrix of LiDAR (rows: disturbance elements; columns: generation principles such as misalignment of the spatial position, saturation or attenuation of S, occlusion, reflection, refraction, noise, multiple reflections and recognition-processing items; cells graded from small to great impact)
Table 27. Perception disturbance elements and generation principle matrix of the camera (element: vehicle/sensor, surrounding environment) (rows: disturbance elements; columns: generation principles across the optics, imager, image processing and recognition stages, e.g. flare, ghost, noise, over/under exposure, flicker and motion blur; cells graded from small to large effect)
Table 28. Perception disturbance elements and generation principle matrix of the camera (element: perception target route/traffic information/obstacle) (rows: disturbance elements for the route, traffic information, obstacle and moving object targets; columns: the same camera generation principles as in Table 27)