#Lidar System AI | Artificial Intelligence-powered Lidar System | Nasdaq Capital Market (OUST.WS)
#Ouster | High-resolution lidar sensors for long, mid, and short range applications | Zero Minimum Range: objects detected right up to sensor window | Picosecond-level accuracy with improved detection capabilities, particularly at near range and on reflective surfaces | Return Sorting Options for dual return data with strongest-to-weakest, nearest-to-farthest, or farthest-to-nearest | Smart infrastructure | Autonomous machines | Robotics applications | Digital lidar architecture | Digital device powered by a fully custom silicon CMOS chip | Perimeter security | Traffic management solution | Meets NEMA TS2 requirements as a detection system for traffic actuation | 3D lidar sensors with edge AI capturing traffic data for active traffic actuation | Platform providing monitoring and visualization in real time | Delivering instant insights for traffic management
#Flyability | Drones for industrial inspection and analysis | Confined space inspection | Collision and crash resistant inspection drone | 3D mapping | Volumetric measurement | Inspections of cargo ships, bridges, ports, steel mills, cement factories, liquid gas tanks, nuclear facilities, city-wide underground sewage systems | Flying mobile scanner | Fits through openings as small as 50x50cm | Creates high resolution scans beyond line of sight | FARO Connect SLAM algorithm | Accurate 3D maps and digital twins of the most inaccessible spaces with centimeter precision | Comprehensive visual representations of challenging indoor spaces | Precise mapping and analysis | 100 m capacity to cover | 1,310,720 pts/sec | 340 m tunnel survey in underground mine | Fixed cage physically protects hardware | Drone recovers flight stability after collision | Survey Software (SS) combines multiple point clouds | SS georeferences point clouds | SS refines 3D models with various filters | SS outputs processed point cloud data to file types LAZ, LAS, PLY, TXT, and E57 | Drone payload: high resolution Ouster OS0-128 Rev 7 LiDAR sensor | Flammable gas sensor
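A minimal sketch of the kind of pipeline described above (merging scans, applying a georeferencing transform, exporting to PLY/TXT); this is not Flyability's Survey Software, and the transform values and data are hypothetical.

```python
# Sketch: merge two point clouds, apply a rigid georeferencing transform,
# and export the result as ASCII PLY and plain text (NumPy only).
import numpy as np

def georeference(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply a rigid transform (rotation R, translation t) to Nx3 points."""
    return points @ R.T + t

def write_ascii_ply(path: str, points: np.ndarray) -> None:
    """Write an Nx3 point array as an ASCII PLY file."""
    header = (
        "ply\nformat ascii 1.0\n"
        f"element vertex {len(points)}\n"
        "property float x\nproperty float y\nproperty float z\n"
        "end_header\n"
    )
    with open(path, "w") as f:
        f.write(header)
        np.savetxt(f, points, fmt="%.3f")

# Two overlapping scans (random stand-ins for real LiDAR output).
scan_a = np.random.rand(1000, 3) * 10.0
scan_b = np.random.rand(1000, 3) * 10.0 + np.array([5.0, 0.0, 0.0])

R = np.eye(3)                                       # no rotation for simplicity
t = np.array([350000.0, 5600000.0, 120.0])          # illustrative easting/northing/height

merged_geo = georeference(np.vstack([scan_a, scan_b]), R, t)
write_ascii_ply("merged.ply", merged_geo)
np.savetxt("merged.txt", merged_geo, fmt="%.3f")    # plain-text export
```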
#Blue White Robotics | Autonomous kit for existing farm tractors | Ultra high-precision navigation | Real-time situational awareness | Ouster 3D digital lidar
#OndoSense | Radar distance sensor | Sensor software: integrated into control system or used for independent quality monitoring | Object detection | Distance measurement | Position control | Agriculture: reliable height control of the field sprayer | Mining industry | Transport & Logistics | Shipping & Offshore | Mechanical and plant engineering | Metal and steel industry | Energy sector | Harsh industrial environments | Dust & smoke: no influence | Rain & snow: no influence | Radar frequency: 122 GHz | Opening angle: ±3° | Measuring range: 0.3 – 40 m | Measuring rate: up to 100 Hz | Output rate: up to 10 ms / 100 Hz | Measurement accuracy: up to ±1 mm | Measurement precision: ±1 mm | Communication protocol: RS485, Profinet, other interfaces via gateway | Switching output: 3x push-pull (PNP/NPN) | Analogue output: Current interface (4 – 20 mA) | Protection class: IP67
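To illustrate how the 4–20 mA analogue output can be read back as a distance, here is a small sketch that assumes a linear scaling of the current loop across the 0.3–40 m measuring range; the actual scaling of the OndoSense output is an assumption here.

```python
# Sketch: convert a 4-20 mA current-loop reading into a distance, assuming the
# loop is scaled linearly over the configured 0.3-40 m range (assumption).
RANGE_MIN_M = 0.3
RANGE_MAX_M = 40.0

def current_to_distance(current_ma: float) -> float:
    """Map 4 mA -> RANGE_MIN_M and 20 mA -> RANGE_MAX_M."""
    if not 4.0 <= current_ma <= 20.0:
        raise ValueError("reading outside 4-20 mA loop")
    fraction = (current_ma - 4.0) / 16.0
    return RANGE_MIN_M + fraction * (RANGE_MAX_M - RANGE_MIN_M)

print(round(current_to_distance(12.0), 2))  # mid-scale reading -> 20.15 m
```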
#Avikus | Autonomous intelligent navigation | Global shipping industry | Fast and secure operations within ports | Automated docking and undocking | Obstacle avoidance | HD Hyundai | Ouster OS sensor
#Optex | LiDAR Perimeter Security | Deploying invisible wall or plane, protecting buildings or assets | Intrusion detection | Accurate outdoor and indoor security sensors with LiDAR technology | Advanced detection capabilities | Long-range customisable detection zones | Enhanced environmental resistance | Creating virtual plane or wall to protect perimeters, buildings, and assets | Analysing size and distance of moving objects | Providing accurate point detection by tracking X & Y coordinates of moving objects | Defining type of intrusion based on object size | Differentiating between people, animals, vehicles, drones and more | Triggering right notifications | Up to 8 detection zones independently configurable allowing for multiple sensitivity scenarios | Integrated camera | Operation down to -40 °C using a heated lens | Dynamic event filtering | Detection zone split into two parts: judgment and alarm zones | Intelligent logic to alarm based on detected vehicle size | Customise triggers | Virtual walls and planes
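A simple illustration of size-based intrusion classification as described above; the thresholds and categories are hypothetical and not Optex's actual logic.

```python
# Illustrative sketch: classify a detected object by the footprint and height
# of its tracked bounding box (hypothetical thresholds).
def classify_intrusion(width_m: float, length_m: float, height_m: float) -> str:
    footprint = width_m * length_m
    if height_m < 0.6 and footprint < 0.5:
        return "animal"
    if height_m < 2.2 and footprint < 1.0:
        return "person"
    if footprint >= 2.0:
        return "vehicle"
    return "unknown"

print(classify_intrusion(0.5, 0.4, 1.8))   # -> "person"
print(classify_intrusion(2.0, 4.5, 1.6))   # -> "vehicle"
```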
#Sebino | Lidar Perimeter Volumetric protection | Artificial intelligence and Deep Learning to distinguish humans from animals, vehicles, and drones | Defining area where only maximum number of people can enter | Via Enrico Mattei, 28, 24040 Madone (BG)
#Pinecone | Vector database market | Vector-search applications | Serverless vector database architecture | gen AI applications | generative AI ecosystem | generative AI tech stack | Monitoring for RAG apps | OpenLLMetry: extension of OpenTelemetry designed for LLMs and vector databases | Pinecone Systems, Inc 548 Market St San Francisco, CA 94104
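As a rough illustration of the core operation a vector database performs for RAG-style retrieval, here is a brute-force cosine-similarity search in NumPy; it is not Pinecone's API, and the embedding dimension and data are placeholders.

```python
# Brute-force nearest-neighbour search over embeddings (what a vector DB
# does at scale with approximate indexes).
import numpy as np

def top_k(query: np.ndarray, index: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the k vectors in `index` most similar to `query`."""
    q = query / np.linalg.norm(query)
    idx = index / np.linalg.norm(index, axis=1, keepdims=True)
    scores = idx @ q                      # cosine similarity against every vector
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(10_000, 384))   # stand-in for stored text embeddings
query_vec = rng.normal(size=384)
print(top_k(query_vec, embeddings, k=3))
```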
#Forterra | AutoDrive | Driverless system | Autonomous vehicle system | Vehicle-platform and payload agnostic | In-Vehicle, Remote or Garrison Oversight | Off-Road/On-Road | GPS-Denied Operation | Single-Vehicle Waypoint Route Navigation | Multi-Vehicle Convoys and Platooning | Static and Dynamic Obstacle Avoidance | Mission Re-Pathing | Retrotraverse and Reverse Platooning, with Trailers | Ouster lidar
#Leddar Sensor | LiDARs for mobility, ITS and industrial applications
#Velodyne | Lidar | Vision for autonomous mobile robots
#Ommatidia Lidar | Product: 3D Light Field Sensor | Channels: 128 parallel | Imaging vibrometry functionality | Target accuracy: 10 µm | Measurement range: 0.5–50 m | Measurement accuracy (MPE): 20 µm + 6 µm/m | Angular range: 30° x 360° | Vibrometry sampling frequency: 40 kHz | Vibrometry max in-band velocity: 15.5 mm/s | Power consumption: 45 W | Battery operation time: 240 min | Interface: Ethernet | Format: CSV / VTK / STL / PLY / TXT | Dimensions: 150 x 228 x 382 mm | Weight: 7.5 kg | Pointer: ~633 nm | Temperature range: 0–40 °C | Environmental protection class: IP54 | Eye safety: Class 1M | Raw point clouds: over 1 million points | Calibration: metrology-grade with compensation of thermal and atmospheric effects | Geotagging: GPS | Output: 3D profiles of large objects
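Reading the accuracy spec as a distance-dependent maximum permissible error of 20 µm plus 6 µm per meter of range (an assumption about the notation), the error budget at a given distance works out as below.

```python
# Sketch of a distance-dependent maximum permissible error (MPE),
# assuming the spec means 20 um + 6 um per meter of range.
def mpe_um(range_m: float) -> float:
    return 20.0 + 6.0 * range_m

for d in (0.5, 10.0, 50.0):
    print(f"{d:5.1f} m -> ±{mpe_um(d):.0f} um")   # e.g. ±320 um at 50 m
```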
#Routescene | UAV lidar system
#Motional | Robotaxis | Ouster as the exclusive provider of long-range lidar sensors for its all-electric IONIQ 5-based robotaxis | Real-time 3D data up to 0.1-degree vertical and horizontal resolution with up to 300-meter range and 360° surround view | Over 30 sensors carefully integrated into the IONIQ 5 robotaxi design | IONIQ 5 robotaxis deployed for Motional’s commercial operations in Las Vegas and Los Angeles, and its testing operations in Boston, Pittsburgh, and Singapore
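A back-of-the-envelope check of what 0.1° angular resolution means in practice: the lateral spacing between neighbouring returns grows linearly with range (small-angle arc length). The ranges below are illustrative.

```python
# Lateral spacing between adjacent lidar returns for a 0.1 degree resolution.
import math

def point_spacing_m(range_m: float, resolution_deg: float = 0.1) -> float:
    return range_m * math.radians(resolution_deg)

for r in (10, 100, 300):
    print(f"{r:3d} m -> {point_spacing_m(r):.2f} m between points")  # ~0.52 m at 300 m
```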
#LASE | Gantry crane outfitted with Ouster digital lidar sensors as part of the LASE solution to automate the handling of containers | Ouster OS1 sensors for precise position measurement and object detection to enable safer autonomous or semi-automated handling of containers | Collision avoidance | Optimal positioning and safe lifting of cargo on trucks | Ouster sensor for zone monitoring | Smart port market expected to grow from ~$2 billion to approximately $11 billion by 2030 | Over 800 container terminals around the world | Existing installations can be upgraded with 3D lidar | LASE won contract to supply 15 LaseLCPS-3D-2D (Load Collision Prevention System) systems for three container terminals in Italy | LaseLCPS-3D-2D system combines 3D/2D measuring systems for crane systems | It avoids collisions between containers in the container stack in trolley travel direction with containers in the operation bay and adjacent bays | Integrated soft-landing function ensures low-noise and wear-reduced container setting down | LASE won contract to supply a LaseBVH (Bulk Volume Heap) system to an industrial automation company in the USA | LaseBVH—Bulk Volume Heap measuring system is a high-precision 3D laser measuring system specially designed for measuring the volume of bulk material stockpiles | Measurements are taken from stackers and/or reclaimers working in the mining area or on stockpiles | Depending on stockpile size and machine, one to two 3D laser scanners are installed on the stacker/reclaimer | LASE Multifunctional Systems for port in Italy | LASE has been awarded the contract to supply two LaseYC-MF-2 systems | LaseYC-MF-2 is a multifunctional laser measurement system based on multilayer laser scanner technology | System is based on two multilayer laser scanners installed below the portal girder above the truck lane | Software connects the sensors to the LASE Control Unit (LCU) | Observing the truck lane area makes it possible to cover the following functions | Cabin Position | Truck Movement Detection | Truck Positioning | System Area Surveillance | Truck Operation
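To illustrate the principle behind bulk-volume measurement from scanner data (not the LaseBVH algorithm itself), a stockpile volume can be approximated by rasterising returns onto a grid and integrating cell height times cell area; the heap geometry below is synthetic.

```python
# Stockpile volume estimate: highest return per grid cell x cell area.
import numpy as np

def stockpile_volume_m3(points: np.ndarray, cell_m: float = 0.5) -> float:
    """points: Nx3 array (x, y, z) with z measured from the pad surface."""
    xy = np.floor(points[:, :2] / cell_m).astype(int)
    heights = {}
    for (ix, iy), z in zip(map(tuple, xy), points[:, 2]):
        heights[(ix, iy)] = max(heights.get((ix, iy), 0.0), z)  # top surface per cell
    return sum(heights.values()) * cell_m * cell_m

# Hypothetical cone-shaped heap: 20 m radius, 8 m high.
rng = np.random.default_rng(1)
r = 20.0 * np.sqrt(rng.uniform(size=50_000))
theta = rng.uniform(0, 2 * np.pi, size=50_000)
x, y = r * np.cos(theta), r * np.sin(theta)
z = 8.0 * (1.0 - r / 20.0)
print(f"{stockpile_volume_m3(np.column_stack([x, y, z])):.0f} m^3")
# Analytic cone volume pi * r^2 * h / 3 ~= 3351 m^3 for comparison.
```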
#Carbon Robotics | Autonomous LaserWeeder Robot | Deep Learning | Laser | Lidar Sensor | Nvidia | Cumming
#JAKA Robotics | Collaborative Robots for Inspection and Testing | Applications
#Vaisala | Industrial and Weather Measurement | Lidar
#CHCNAV | Fully-Integrated Drone LIDAR+RGB Mapping System | Accurate 3D data collection | Capturing 3D reality in streamlined workflow | Simultaneous acquisition of point cloud data and high-resolution imagery | Capturing precise ground surfaces | Highly accurate Digital Elevation Models (DEMs) | Digital Surface Models (DSMs)
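A minimal sketch of how DSMs and DEMs can be derived from a lidar point cloud: the DSM keeps the highest return per grid cell, while a crude DEM proxy keeps the lowest; production DEM extraction also requires ground-point classification, which is omitted here, and the data below is synthetic.

```python
# Rasterise a point cloud into per-cell surface (DSM) and terrain-proxy (DEM) heights.
import numpy as np

def rasterize(points: np.ndarray, cell_m: float = 1.0):
    ix = np.floor(points[:, 0] / cell_m).astype(int)
    iy = np.floor(points[:, 1] / cell_m).astype(int)
    dsm, dem = {}, {}
    for cx, cy, z in zip(ix, iy, points[:, 2]):
        key = (cx, cy)
        dsm[key] = max(dsm.get(key, -np.inf), z)   # tops of canopy/buildings
        dem[key] = min(dem.get(key, np.inf), z)    # lowest hit per cell
    return dsm, dem

pts = np.random.rand(10_000, 3) * np.array([100.0, 100.0, 30.0])
dsm, dem = rasterize(pts)
print(len(dsm), "cells rasterised")
```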
#Outsight | Transforming Raw 3D data from different manufacturers into actionable information | Software solutions to enable anonymous monitoring of people and vehicle flows in a variety of contexts such as airports and railway stations, shopping centres and sports facilities | Intelligent Transportation Systems benefiting from 3D LiDAR perception | Obtaining operational information: waiting time in queue, safe distance detection | Autonomous car | Lidar | Object distance, size and volume calculation
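A minimal sketch of object distance, size, and volume calculation from the 3D points assigned to a tracked object, using an axis-aligned bounding box; the cluster data is synthetic and this is not Outsight's implementation.

```python
# Distance, size, and volume of a point cluster via its axis-aligned bounding box.
import numpy as np

def object_metrics(cluster: np.ndarray, sensor_origin=np.zeros(3)) -> dict:
    mins, maxs = cluster.min(axis=0), cluster.max(axis=0)
    dims = maxs - mins                                   # width, depth, height
    centroid = cluster.mean(axis=0)
    return {
        "distance_m": float(np.linalg.norm(centroid - sensor_origin)),
        "size_m": dims.round(2).tolist(),
        "volume_m3": float(np.prod(dims)),
    }

cluster = np.random.rand(500, 3) * np.array([0.6, 0.5, 1.8]) + np.array([4.0, 2.0, 0.0])
print(object_metrics(cluster))
```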
#LeddarTech | Environmental sensing solutions for autonomous vehicles and ADAS
#Outrider | Autonomous system on zero-emission electric vehicle platform | Electric Autonomous Vehicles | Improving turnaround times per truck in distribution centers | Helping ease supply chain bottlenecks | Hitching and unhitching trailers | Monitoring trailer locations
#Third Wave | Hybrid autonomous vehicles and intelligent fleet management system | Autonomous vehicles in warehouses | Object detection in warehouse aisles | Automated material handling | Automating forklift | Ouster digital lidar sensor
#Luminar | Lidar technology
#Nissan | Fusion of LiDAR, camera and radar | Perception algorithm | Vehicle-control algorithm | 3D perception of vehicle surroundings | Reproducing space and objects | Recognizing scene context: vehicle category, road structure, traffic signs, characters | Recognizing motion of surrounding objects
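One building block of lidar/camera fusion is projecting lidar returns into the camera image so 3D points can be associated with camera detections; the pinhole intrinsics and extrinsics below are hypothetical placeholders, not Nissan's calibration.

```python
# Project lidar points into an image with a pinhole camera model.
import numpy as np

K = np.array([[1000.0, 0.0, 960.0],      # focal lengths and principal point (pixels)
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                             # lidar-to-camera rotation (placeholder)
t = np.array([0.0, -0.2, 0.1])            # lidar-to-camera translation, metres

def project(points_lidar: np.ndarray) -> np.ndarray:
    """Return Nx2 pixel coordinates for points ahead of the camera."""
    cam = points_lidar @ R.T + t
    cam = cam[cam[:, 2] > 0.5]            # discard points behind / too close
    uvw = cam @ K.T
    return uvw[:, :2] / uvw[:, 2:3]

pts = np.array([[2.0, 0.0, 10.0], [-1.0, 0.5, 25.0]])
print(project(pts))
```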
#Plus ai | Open Platform for Autonomy (OPA) | Software platform | Converting sensor data from radar, lidar, and cameras into actionable intelligence | Building and updating maps to aid in navigation, perception, planning, and localization | OS: Linux, QNX | Luminar
#CHCNAV | Airborne LiDAR system | Corridor mapping and surveying | 2 million points per second | 15mm linear accuracy for long-range scans up to 150 m | Continuously rotating polygon mirror | 600 lines per second | 2 million pulses per second
#Hesai Technology | Lidars for autonomous vehicles
#SiLC Technologies | Frequency-modulated continuous wave (FMCW) lidar | Detecting vehicles and various obstacles from long distances | Detects tires at 150 m (492.1 ft.) away and a person in dark clothing at 300 m (984.2 ft.) | Fully qualified fabrication process | Proven, highly automated photonics assembly and manufacturing techniques | Full integration of all the photonics functions, including lasers and detectors, into a single chip | Utilizing frequency modulated continuous wave (FMCW) at the 1550nm wavelength | 1550 nanometer wavelength addresses eye safety regulatory concerns | Eyeonic Vision Sensor platform | Suite of development kits | Detailed point cloud visuals streamed over Ethernet | Ultra Long Range (ULR) Eyeonic Vision System
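For context on how FMCW ranging works in general: a linear chirp of bandwidth B over duration T mixed with its return produces a beat frequency proportional to range, so R = c · f_b · T / (2B). The chirp parameters below are illustrative assumptions, not SiLC's specifications.

```python
# Worked FMCW relation: f_b = 2*B*R / (c*T)  <=>  R = c*f_b*T / (2*B).
C = 299_792_458.0          # speed of light, m/s

def beat_to_range(f_beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    return C * f_beat_hz * chirp_s / (2.0 * bandwidth_hz)

B = 1e9        # 1 GHz optical chirp bandwidth (assumed)
T = 1e-3       # 1 ms chirp duration (assumed)
print(f"{beat_to_range(2.0e6, B, T):.1f} m")   # 2 MHz beat -> ~300 m
```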
#Neptune Labs | neptune.ai | Tracking foundation model training | Model training | Reproducing experiments | Rolling back to the last working stage of model | Transferring models across domains and teams | Monitoring parallel training jobs | Tracking jobs operating on different compute clusters | Rapidly identifying and resolving model training issues | Workflow set up to handle the most common model training scenarios | Tool to organize deep learning experiments
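A minimal experiment-tracking sketch using the neptune Python client (call names reflect recent client versions and may differ; the project name, token, parameters, and loss values are placeholders).

```python
# Log parameters, a per-step metric series, and tags for one training run.
import neptune

run = neptune.init_run(
    project="my-workspace/my-project",    # placeholder
    api_token="YOUR_API_TOKEN",           # placeholder
)
run["parameters"] = {"lr": 1e-3, "batch_size": 64, "backbone": "resnet50"}

for step, loss in enumerate([0.92, 0.71, 0.55, 0.48]):   # stand-in training loop
    run["train/loss"].append(loss, step=step)            # series logged per step

run["sys/tags"].add(["baseline", "gpu-cluster-a"])        # organise runs for comparison
run.stop()
```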
#Genetec | Security Center Platform | Ouster Gemini integrated | Fusing lidar and video surveillance into a single interface | Unifying customer data from lidar, cameras and radar | Physical intrusion detection in real-time | Using 3D data to power automated detection, classification, tracking, and monitoring | Measuring the distance, trajectory or speed of people or vehicles | Quickly identifying and responding to real threats with confidence | Reducing false alarms
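A small sketch of how speed can be derived from a lidar track: finite differences over the timestamped positions reported for each object. This illustrates the measurement, not Genetec's or Ouster's implementation, and the track data is synthetic.

```python
# Instantaneous speed between consecutive track updates.
import numpy as np

def speeds_mps(timestamps_s: np.ndarray, positions_xy: np.ndarray) -> np.ndarray:
    d = np.linalg.norm(np.diff(positions_xy, axis=0), axis=1)   # distance per step
    dt = np.diff(timestamps_s)                                  # elapsed time per step
    return d / dt

t = np.array([0.0, 0.1, 0.2, 0.3])
xy = np.array([[0.0, 0.0], [1.2, 0.1], [2.4, 0.2], [3.7, 0.2]])
print(speeds_mps(t, xy))          # ~12-13 m/s, i.e. roughly 45 km/h
```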
#Linux Foundation | LF AI & Data | Fostering open source innovation in artificial intelligence and data | Open Platform for Enterprise AI (OPEA) | Creating flexible, scalable Generative AI systems | Promoting sustainable ecosystem for open source AI solutions | Simplifying the deployment of generative AI (GenAI) systems | Standardization of Retrieval-Augmented Generation (RAG) | Supporting Linux development and open-source software projects | Linux kernel | Linus Torvalds
#outrider.ai | Automating yard operations for logistics hubs | Electric, zero-emission yard trucks | Advanced Testing Facility, which mimics distribution yards
#BrainChip | Ultra-low power, fully digital, event-based, brain-inspired AI | Low power acceleration co-processor | Enabling very compact, ultra-low power, portable and intelligent devices | Accelerates limited use case-specific neural network models | SDK for rapidly developing and deploying AI applications for the Edge | Support for models created with TensorFlow/Keras and PyTorch | Event-based compute platform ideal for early detection, low-latency solutions | Development contract from Air Force Research Laboratory (AFRL) on neuromorphic radar signal processing technologies | Mapping Complex Sensor Signal Processing Algorithms onto Neuromorphic Chips | Improving cognitive communication capabilities on size, weight and power & cost (SWaP-C) constrained platforms such as military, spacecraft and robotics | Embedding sophisticated radar processing solutions in SWaP-C constrained radar platforms | Low-power, high-performance computing in the most mission-critical use cases | Hardware and AI model using the Temporal Event-based Neural Network (TENNs) model
#Ouster | BlueCity | Powering Lidar-Enabled Smart Traffic Solution | Traffic management solution in Chattanooga, Tennessee | Improving roadway safety | Reducing congestion | BlueCity solution to over 120 intersections | Combining digital lidar sensors and edge AI at each intersection | Managing traffic flow | Detecting and analyzing safety incidents | Providing detection for vehicle-to-everything (V2X) communications | Advanced perception software from Ouster | Lidar-powered smart traffic network | Optimizing traffic signal management on roads and intersections | Providing data to improve pedestrian safety | Intelligent signal actuation at intersections | Generating analytics data stream to traffic operators | Creating real-time 3D digital traffic twin of an intersection or road | Automating data collection in the cloud | Monitoring road events more accurately for vehicles, pedestrians and cyclists | Quick safety interventions | Long-term planning optimizations | Deep learning AI perception | Object classification | Object detection | Traffic actuation | Near-miss detection | Outside of crosswalk events | Red light running | Wrong-way driving | Southern Lighting & Traffic Systems | Center for Urban Informatics & Progress (CUIP) | University of Tennessee Chattanooga Research Institute (UTCRI) | Certified lidar traffic solution with Buy America(n) lidar
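As an illustration of one of the event types listed above, wrong-way driving can be flagged by comparing a tracked vehicle's heading against the lane's permitted direction; the threshold and data are hypothetical, and this is not the BlueCity implementation.

```python
# Wrong-way check: does the velocity vector oppose the lane direction?
import numpy as np

def is_wrong_way(velocity_xy: np.ndarray, lane_direction_xy: np.ndarray,
                 min_speed_mps: float = 1.0) -> bool:
    speed = np.linalg.norm(velocity_xy)
    if speed < min_speed_mps:              # ignore stationary or creeping objects
        return False
    lane = lane_direction_xy / np.linalg.norm(lane_direction_xy)
    return float(np.dot(velocity_xy / speed, lane)) < -0.5   # heading opposes lane

lane = np.array([1.0, 0.0])                        # permitted direction of travel
print(is_wrong_way(np.array([-8.0, 0.5]), lane))   # True: driving against the lane
print(is_wrong_way(np.array([7.5, -0.2]), lane))   # False: with the flow
```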
#NavVis | Point cloud processing workflow | Surveying | Laser scanning | Process laser scan data captured using NavVis technology from anywhere with internet connection | View and validate every control point | Ensure data privacy with fully automated and integrated blurring functionality | Point cloud presets | Environment-specific point cloud modes | Generate photorealistic point clouds automatically cleaned of dynamic objects while preserving original details and colors | Upload your geo-referenced control points file in a global coordinate system | Automatically geo-register your point cloud | View control points' exact locations on a quality map | Locate, verify, select/deselect control points to identify any potential errors before processing begins | Automated image anonymization process | Detecting and blurring individuals’ faces, bodies, and license plates in images and point clouds captured with NavVis devices | Confidently create and share projects that meet strict data confidentiality | Meet compliance standards, including GDPR requirements | Download comprehensive quality report | Detailed information on data accuracy
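To show the core idea behind geo-registration with control points, here is a least-squares rigid alignment (Kabsch/SVD) that maps scan-frame control points onto their surveyed global coordinates; the control-point values are hypothetical and this is only a sketch of the general technique, not the NavVis pipeline.

```python
# Estimate the rigid transform dst ~= src @ R.T + t from matched control points.
import numpy as np

def fit_rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t (Kabsch algorithm)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - src_c @ R.T
    return R, t

# Hypothetical control points: scan-frame coordinates vs. surveyed coordinates.
scan_cp = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 5.0, 0.0], [3.0, 2.0, 1.0]])
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
global_cp = scan_cp @ true_R.T + np.array([1000.0, 2000.0, 50.0])

R, t = fit_rigid_transform(scan_cp, global_cp)
residual = np.abs(scan_cp @ R.T + t - global_cp).max()
print(f"max control-point residual: {residual:.6f} m")
```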
#Rijkswaterstaat | Dutch Ministry of Infrastructure and Water Management | Harbour Master of Rotterdam | Planning to develop a warning system based on lidar (3-D radar) | Inland vessels colliding with the Willemsbrug | Collisions occurring in a densely populated urban area receive a great deal of public attention | Collisions with bridges carry a certain level of risk | Royal Dutch Inland Shipping Association | Platform Zero Incidents | Rotterdam-Rijnmond Safety Region | Dutch Ministry of Defence | Dutch National Coordinator for Security and Counterterrorism (NCTV) | FERM Foundation
#Robotics & AI Institute | Collaborates with Boston Dynamics | Developed jointly Reinforcement Learning Researcher Kit for Spot quadruped robot | Developing sim-to-real for mobility | Transferring simulation results to real robotic hardware | Bridging sim-to-reality gap | Training policies generating a variety of agile behavior on physical hardware | Trying to achieve novel, robust, and practical locomotion behavior | Improving whole body loco-manipulation | Developing robot capability to manipulate objects and fixtures, such as doors and levers, in conjunction with locomotion significantly enhancing its utility | Exploring new policies to improve robustness in scenarios | Exploring full-body contact strategies | Exploring high-performance, whole-body locomotion and tasks that require full-body contact strategies, such as dynamic running and full-body manipulation of heavy objects, necessitating close coordination between arms and legs | Aiming to utilize reinforcement learning to generate behavior during complex contact events without imposing strict requirements | Develop technology that enables future generations of intelligent machines | Streamlining processes for robots to achieve new skills | Developing perception, situational understanding, reasoning, cognitive functions underpinning robot abilities and combining them with advances in their physical capabilities | Conducting research in four core areas: cognitive AI, athletic AI, organic hardware design, and ethics related to robotics
#UC Berkeley, CA, USA | Professor Trevor Darrell | Advancing machine intelligence | Methods for training vision models | Enabling robots to determine appropriate actions in novel situations | Approaches to make VLMs smaller and more efficient while retaining accuracy | How LLMs can be used as visual reasoning coordinators, overseeing the use of multiple task-specific models | Utilizing visual intelligence at home while preserving privacy | Focused on advancements in object detection, semantic segmentation and feature extraction techniques | Researched advanced unsupervised learning techniques and adaptive models | Researched cross-modal methods that integrate various data types | Advised SafelyYou, Nexar, SuperAnnotate, Pinterest, Tyzx, IQ Engines, Koozoo, BotSquare/Flutter, MetaMind, Trendage, Center Stage, KiwiBot, WaveOne, DeepScale, Grabango | Co-founder and President of Prompt AI