Show simple item record

dc.contributor.author: Schneider, Nick
dc.date.accessioned: 2026-03-17T14:46:10Z
dc.date.available: 2026-03-17T14:46:10Z
dc.date.issued: 2026
dc.identifier.issn: 1613-4214 (Online)
dc.identifier.uri: https://oapen-dev.siscern.org/handle/20.500.12657/109107
dc.description.abstract: Fusing camera and LIDAR data in autonomous driving poses challenges such as accurate calibration, differing data representations, and extensive training data requirements. This dissertation addresses these challenges with three contributions: a deep neural network for LIDAR-to-camera calibration, two depth completion approaches for processing sparse depth measurements in the image space, and a large-scale dataset of 93k RGB and depth images for training and evaluating deep networks.
dc.language: English
dc.relation.ispartofseries: Schriftenreihe / Institut für Mess- und Regelungstechnik, Karlsruher Institut für Technologie
dc.subject.classification: thema EDItEUR::T Technology, Engineering, Agriculture, Industrial processes::TG Mechanical engineering and materials::TGB Mechanical engineering
dc.subject.other: Bildverstehen
dc.subject.other: Computer Vision
dc.subject.other: Machine Learning
dc.subject.other: Neural Networks
dc.subject.other: Neuronale Netze
dc.subject.other: Sensor Fusion
dc.subject.other: Sensorfusion
dc.subject.other: Maschinelles Lernen
dc.title: Deep Fusion of Camera and LIDAR
dc.type: book
oapen.identifier.doi: 10.5445/KSP/1000169933
oapen.relation.isPublishedBy: 44e29711-8d53-496b-85cc-3d10c9469be9
oapen.relation.isbn: 9783731513612
oapen.relation.isbn: 9783731513261
oapen.imprint: KIT Scientific Publishing
oapen.series.number: 50
oapen.pages: 140
oapen.place.publication: Karlsruhe, Germany


