{"id":14899,"date":"2026-01-29T12:15:49","date_gmt":"2026-01-29T17:15:49","guid":{"rendered":"https:\/\/gf.com\/?p=14899"},"modified":"2026-01-29T12:16:47","modified_gmt":"2026-01-29T17:16:47","slug":"sensor-fusion-in-action-how-cameras-and-lidar-integrate-with-radar-for-safer-driving","status":"publish","type":"post","link":"https:\/\/gf.com\/blog\/sensor-fusion-in-action-how-cameras-and-lidar-integrate-with-radar-for-safer-driving\/","title":{"rendered":"Sensor fusion in action: How cameras and LiDAR integrate with radar for safer driving"},"content":{"rendered":"\n<p>By <em>Yuichi Motohashi, Deputy Director \/ Global Segment Lead, Automotive Display, Camera, LiDAR &amp; SerDes, GlobalFoundries<\/em><\/p>\n\n\n\n<p>Sense \u2013 analyze \u2013 act. This is the principle on which advanced driver assistance systems (ADAS) operate. Modern vehicles rely on a network of sensors to build a more precise, reliable perception of their surroundings. Sensor fusion combines these inputs \u2013 from <a href=\"https:\/\/gf.com\/blog\/the-sixth-sense-of-automotive-safety-how-radar-is-guiding-the-future-of-driving\/\">radar<\/a>, camera, LiDAR and ultrasound \u2013 with artificial intelligence and deep learning to deliver the environmental acuity vehicles need to make split-second decisions.<\/p>\n\n\n\n<p>Since 1999, when <a href=\"https:\/\/gf.com\/gf-press-release\/gf-drives-progress-next-generation-automotive-radar\/\" target=\"_blank\" rel=\"noreferrer noopener\">Mercedes-Benz \u201ctaught the car to see,\u201d<\/a> radar has been a proven cornerstone of ADAS. Camera and LiDAR technologies, however, are advancing rapidly, adding new levels of detail and depth to a vehicle\u2019s perception. LiDAR in particular has long been stuck between functional solutions and scalable manufacturing. GF is closing that gap, using FinFET, advanced packaging and photonics to unlock the path to mass-market viability.<\/p>\n\n\n\n<p>Together, these complementary sensors provide high-resolution imagery, 3D mapping and object classification \u2013 each essential for the safer driving of today and the fully autonomous mobility of tomorrow.<\/p>\n\n\n\n<p><strong>Cameras: Sharpening your car\u2019s view of the world<\/strong><\/p>\n\n\n\n<p>Cameras capture high-quality images around the car to detect lane markings, speed limits, turn signals, pedestrians and more. Sophisticated algorithms analyze these images to determine the distance, size and speed of objects, enabling the system to react appropriately.<\/p>\n\n\n\n<p>Automotive cameras do not use the ultra-high megapixel counts of mobile phones, because every additional pixel means more data for the vehicle\u2019s computer to process. Extremely high-resolution images would significantly expand the volume of data sent to the central processor, potentially exceeding the capabilities of the System on Chip (SoC) processors that must analyze this information instantaneously to ensure safety. Excessive data can slow processing or overwhelm the system, so detection distance must be carefully balanced against the processing power of the central SoC.<\/p>\n\n\n\n<p>The primary image quality Key Performance Indicator (KPI) is dynamic range, which is vital for maintaining accuracy in difficult lighting and weather conditions \u2013 ranging from intense sunlight at dusk to darkness, heavy rainfall or fog. 
Achieving such high dynamic range imaging requires increasingly sophisticated Read-out ICs (ROICs) within automotive stacked CMOS Image Sensors (CIS). System-level, circuit-level and transistor-level requirements for a high-performance automotive CIS ROIC are directly linked.<\/p>\n\n\n\n<p><strong>System-level<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Enhanced resolution (from 8MP to 12\u201316MP), frame rate (\u226530fps) and dynamic range (\u2265130dB) are necessary, collectively increasing the processing load on the ROIC.<\/li>\n<li>Transmission bandwidth of at least 6Gbps is essential, underscoring the need for SerDes integration.<\/li>\n<li>Long-range detection depends on high pixel resolution, high-speed operation and minimal read noise (including 1\/f and RTS noise).<\/li>\n<li>Improved low-light performance requires minimizing both ADC and transistor noise.<\/li>\n<\/ul>\n\n\n\n<p><strong>Circuit-level<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>To accommodate high bandwidth, circuits must achieve elevated clock speeds, low jitter and reduced noise.<\/li>\n<li>Die size limitations call for high capacitor density, robust transconductance (gm) and efficient logic cell area usage.<\/li>\n<li>Reliable functionality at temperatures up to 125\u00b0C demands low leakage characteristics.<\/li>\n<\/ul>\n\n\n\n<p><strong>Transistor-level<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>High-speed operation mandates transistors with superior Ft\/Fmax and low-noise characteristics.<\/li>\n<li>Consistent performance at elevated temperatures relies on effective leakage control and optimized transistor density.<\/li>\n<\/ul>\n\n\n\n<p>Images captured by vehicular cameras underpin many ADAS features, such as lane departure warnings, collision avoidance and parking assistance, making them integral to contemporary automotive safety solutions. GF\u2019s advanced technology platform continues to support the development of state-of-the-art automotive CIS solutions.<\/p>\n\n\n\n<p><strong>LiDAR: Mapping the roads in 3D<\/strong><\/p>\n\n\n\n<p>If cameras are the car\u2019s eyes, LiDAR adds depth perception. Instead of capturing 2D images, LiDAR emits laser pulses and measures their return to generate a 3D point cloud of the surroundings.<\/p>\n\n\n\n<p>In doing so, LiDAR generates a detailed 3D map of the world around your vehicle. This is ultimately how the car knows the difference between a pedestrian, a bicyclist, an animal, another car or a garbage can. Take <a href=\"https:\/\/techcrunch.com\/2025\/07\/30\/auroras-autonomous-trucks-are-now-driving-at-night-its-next-big-challenge-is-rain\/\" target=\"_blank\" rel=\"noreferrer noopener\">Aurora<\/a>, the commercial self-driving truck service. 
Its long-range LiDAR detects objects in the dark of night more than 450 meters away, identifying them up to 11 seconds sooner than a human driver would.<\/p>\n\n\n\n<p>This precise 3D vision powers today\u2019s ADAS features \u2013 like lane keeping, pedestrian detection and adaptive cruise control \u2013 and is laying the foundation for full self-driving functionality in the future.<\/p>\n\n\n\n<p><strong>Key figures of merit for automotive LiDAR systems<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Detection range and accuracy: long-range LiDAR must exceed 300\u202fm detection distance.<\/li>\n<li>Field of View (FoV): short-range horizontal FoV target ~150\u00b0; vertical FoV 20\u201330\u00b0.<\/li>\n<li>Angular resolution: long-range 0.1\u20130.15\u00b0; short-range 0.6\u00b0.<\/li>\n<li>Distance resolution\/ranging accuracy: target improvement to around 5\u202fcm.<\/li>\n<li>Frame rate: increased target of 30\u202ffps.<\/li>\n<li>Point rate: dToF increasing to ~10\u202fM pts\/sec; FMCW expected ~2\u202fM pts\/sec.<\/li>\n<li>Power consumption: system-level power target &lt;&lt;\u202f20\u202fW.<\/li>\n<\/ul>\n\n\n\n<p><strong>How GlobalFoundries powers smarter sensors<\/strong><\/p>\n\n\n\n<p>GF is at the forefront of advancing both camera and LiDAR technologies, delivering solutions that improve performance, integration and efficiency.<\/p>\n\n\n\n<p>For cameras, the image sensor is the core component that determines performance. GlobalFoundries delivers advanced Readout IC (ROIC) solutions for stacked CMOS Image Sensors (CIS), built on industry-leading 40nm and 22nm process nodes to meet the demanding requirements of next-generation automotive applications. Both platforms provide low-noise analog performance and low power consumption even at extreme automotive temperatures. The 40nm-based image sensor delivers excellent image quality and high reliability, while the 22nm-based platform adds outstanding signal processing capability and low-power operation. Some of the benefits are:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Higher resolution and improved dynamic range:<\/strong> GF\u2019s solutions enable image sensors to capture higher-resolution images with wider dynamic range by enabling faster, lower-noise A\/D conversion at lower power consumption.<\/li>\n<li><strong>System integration:<\/strong> Integrating essential components like memory, the ISP (Image Signal Processor), analog and high-speed interfaces onto a single 
chip reduces the complexity of ADAS designs.<\/li>\n<\/ul>\n\n\n\n<p>With cameras generating high volumes of data, Serializer\/Deserializer (SerDes) technology converts that data into a fast, streamlined stream, sends it over a single wire, and then converts it back for processing. GF is playing an active role in the OpenGMSL alliance and supporting SerDes-integrated smart sensors.<\/p>\n\n\n\n<p>For LiDAR, GF\u2019s silicon photonics 45SPCLO platform can integrate the laser source, light emitter, receiver and signal processing on a single chip, reducing LiDAR size and making it easier to fit into vehicles. Working with both O-band and C-band wavelengths, the platform also uses a special silicon nitride (SiN) waveguide to achieve best-in-class propagation loss.<\/p>\n\n\n\n<p>In addition, GF\u2019s HP silicon germanium (SiGe) is the gold standard for image quality in high-performance LiDAR, offering unparalleled response times for the transimpedance amplifiers that process signals and detect objects faster.<\/p>\n\n\n\n<p>Advantages include:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Miniaturization:<\/strong> Integrating multiple optical components onto one chip results in more cost-efficient, compact LiDAR systems. Developing highly integrated, true solid-state FMCW LiDAR lowers manufacturing costs, making LiDAR more accessible.<\/li>\n<li><strong>Electronics integration:<\/strong> Combining <a href=\"https:\/\/gf.com\/gf-press-release\/globalfoundries-acquires-advanced-micro-foundry-accelerating-silicon-photonics-global-leadership-and-expanding-ai-infrastructure-portfolio\/\">SiPh<\/a> with CMOS electronics enables enhanced signal processing for smarter, more capable sensors.<\/li>\n<\/ul>\n\n\n\n<p><strong>The rise of cameras and LiDAR to steer the future of autonomous driving<\/strong><\/p>\n\n\n\n<p>Radar, cameras and LiDAR each shine on their own, but they need to work in concert to make cars smarter and safer. GF\u2019s technology sits at the heart of fusing these sensors, helping cars on the road see farther, react quicker and make smarter decisions in the blink of an eye.<\/p>\n\n\n\n<p>While cameras and LiDAR are still emerging technologies in the automotive industry, there\u2019s massive potential to advance their performance and integration. GF is empowering automakers to accelerate the deployment of safer, smarter and more autonomous vehicles.<\/p>\n\n\n\n<div style=\"height:94px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<figure class=\"wp-block-image alignleft size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"232\" height=\"298\" src=\"https:\/\/gf.com\/wp-content\/uploads\/2025\/07\/7.-Yuichi-Motohashi-headshot.jpg\" alt=\"Yuichi Motohashi headshot\" class=\"wp-image-14187\" style=\"width:138px;height:auto\"\/><\/figure>\n\n\n\n<p><strong>Author bio<\/strong><\/p>\n\n\n\n<p><em>Yuichi Motohashi is the Deputy Director of End Markets at GlobalFoundries, responsible for leading the global segment in automotive cameras, LiDAR, SerDes and displays, which enable next-generation ADAS, autonomous driving and enhanced in-cabin experiences.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>By Yuichi Motohashi, Deputy Director \/ Global Segment Lead, Automotive Display, Camera, LiDAR &amp; SerDes, GlobalFoundries. Sense \u2013 analyze \u2013 act. 
[&hellip;]<\/p>\n","protected":false},"author":33,"featured_media":14900,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[],"market_tags":[],"article_type":[2]}