Equipped with a wide span of sensors, predominant autonomous driving solutions are becoming increasingly modular for safe system design. Though these sensors have laid a solid foundation, most mass-production solutions to date still fall into the L2 phase. Among these, Comma.ai catches our attention,
claiming that one $999 aftermarket device, mounted with a single camera and board inside, owns the ability to handle L2 scenarios. Together with the open-sourced software of the entire system released by Comma.ai, the project is named Openpilot. Is it possible? If so, how is it made possible? With curiosity in mind, we deep-dive into Openpilot and conclude that its key to success is the end-to-end system design rather than a conventional modular framework. The model, briefed as Supercombo, can predict the ego vehicle's future trajectory and other road semantics on the fly from monocular input. Unfortunately, the training process and the massive amount of data that make all this work are not publicly available. To conduct an intensive investigation, we reimplement the training details and test the pipeline on public benchmarks. The refactored network proposed in this work is referred to as OP-Deepdive. For a fair comparison of our version to the original Supercombo, we introduce a dual-model deployment scheme to test the driving performance in the real world. Experimental results on nuScenes, Comma2k19, CARLA, and in-house realistic scenarios verify that a low-cost device can indeed achieve most L2 functionalities and be on par with the original Supercombo model. In this report, we share our latest findings, shed some light on a new perspective of end-to-end autonomous driving from an industrial, product-level side, and hope to inspire the community to continue improving the performance. Our code and benchmarks are available at https://github.com/OpenPerceptionX/Openpilot-Deepdive.