PROPHESEE

Semiconductor Manufacturing

Metavision Technologies

About

Prophesee is the inventor of the world’s most advanced neuromorphic vision systems. The company developed a breakthrough Event-Based Vision approach to machine vision. This new vision category enables significant reductions in power, latency and data-processing requirements, revealing what was previously invisible to traditional frame-based sensors. Prophesee’s patented Metavision® sensors and AI mimic how the human eye and brain work to dramatically improve efficiency in areas such as autonomous vehicles, industrial automation, IoT, security and surveillance, and AR/VR. Prophesee is based in Paris, with local offices in Grenoble, Shanghai, Tokyo and Silicon Valley. The company is driven by a team of 130 visionary engineers, holds more than 50 international patents and is backed by leading international investors including Sony, iBionext, 360 Capital Partners, Intel Capital, Robert Bosch Venture Capital, Supernova Invest, and the European Investment Bank.
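For readers unfamiliar with the category, the data model behind event-based vision can be sketched in a few lines: each pixel independently emits an (x, y, timestamp, polarity) event when its log-intensity changes past a contrast threshold, so static regions generate no data at all. The simulation below is a generic, hypothetical illustration of that principle, not Prophesee's actual sensor pipeline; the threshold value and the event tuple format are assumptions for the example.

```python
import numpy as np

def events_from_frames(frames, threshold=0.2):
    """Naive event-camera simulation: emit (t, x, y, polarity) whenever a
    pixel's log-intensity drifts past +/- threshold since its last event.
    Static pixels emit nothing, which is the source of the data sparsity."""
    log_ref = np.log(frames[0].astype(np.float64) + 1e-6)  # per-pixel reference
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        log_now = np.log(frame.astype(np.float64) + 1e-6)
        diff = log_now - log_ref
        fired = np.abs(diff) >= threshold
        ys, xs = np.nonzero(fired)
        for x, y in zip(xs, ys):
            events.append((t, x, y, 1 if diff[y, x] > 0 else -1))
        log_ref[fired] = log_now[fired]  # reset reference where events fired
    return events

# A static scene produces zero events; a single changing pixel produces one.
static = [np.full((4, 4), 100, dtype=np.uint8)] * 3
print(events_from_frames(static))  # []
moving = [np.full((4, 4), 100, dtype=np.uint8) for _ in range(2)]
moving[1][2, 3] = 200  # brighten one pixel -> one positive-polarity event
print(events_from_frames(moving))  # [(1, 3, 2, 1)]
```

The key design point this illustrates is that output volume scales with scene activity rather than with resolution times frame rate.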

Website
https://www.prophesee.ai
Industry
Semiconductor Manufacturing
Company size
51-200 employees
Headquarters
Paris
Type
Civil company/Commercial company/Other types of companies
Founded
2014
Specialties
Neuromorphic Engineering, Computer Vision, Image Sensors, Vision Systems, Analog and Mixed Signal Chip Design, Image Processing, Machine Vision, Autonomous Navigation, Robotics, AR/VR, IoT, AI and Machine Learning

Products

Locations

Employees at PROPHESEE

News

  • View organization page for PROPHESEE

    10,398 followers

    💥 [BREAKING] Prophesee and AMD have teamed up to deliver an industry-first Event-based Vision solution running on the AMD Kria™ KV260 Vision AI Starter Kit. 🚀

    Developers can now leverage Prophesee's Event-based Metavision® sensor and AI alongside AMD Kria's performance, power and speed to create advanced Edge AI machine vision applications.

    Our collaboration marks the first Event-based Vision development kit compatible with an AMD platform, providing customers a robust tool for industrial-grade solutions in smart city infrastructure, machine vision, security cameras, retail analytics, and more. 👉 https://lnkd.in/e_HRMpjt

    Come experience it live at our booth 3452 if you are at the Automate Show this week!

    #automate2024 #EventBasedVision #EdgeAI #MachineVision #AMD #Kria #KV26

    Prophesee delivers Event-based AMD Kria Starter Kit

    https://www.prophesee.ai

  •

    🚀 Prophesee and Bpifrance: A €15 Million Investment for “Made in France” Neuromorphic AI

    We are proud to announce a major investment of €15 million for the development of neuromorphic AI in France! 🎉 The project aims to develop a new generation of neuromorphic vision sensors, manufactured in France, targeting embedded AI for a total market of 1.3 billion smartphones.

    This initiative is co-financed by Prophesee and Bpifrance as part of the AAP Embedded AI program launched by France 2030. Together, with a cutting-edge French semiconductor ecosystem, we are shaping the future of embedded AI and strengthening our global technological leadership in neuromorphic AI. 👉 https://lnkd.in/drm3k736

    #Eventcamera #Eventbasedvision #Semiconductor #AI #Neuromorphic #Bpifrance #France2030 #MadeInFrance

  •

    [BREAKING] The NEW Metavision® Starter Kit – AMD Kria KV260 is now available with GenX320! The recently announced industry’s first Event-based Vision development kit compatible with an AMD platform is now available with GenX320 - the world's smallest and most power-efficient event-based vision sensor.

    GenX320 features include:
    ✅ >120dB dynamic range
    ✅ 320x320px resolution
    ✅ Ultra-low power mode (down to 36μW)

    This addition to the previously available IMX636 HD variant enables developers to work on an even broader range of advanced Edge AI machine vision applications with low-power requirements, such as:

    🔹 Industrial Automation
    • High-speed counting and sizing
    • Inspection, Robot/AMR guidance
    • Machine Learning
    • Active marker-based 3D pose estimation
    • Preventative Maintenance

    🔹 IoT edge devices and Smart Cities
    • People counting and tracking
    • Speed/Trajectory measurement
    • AI on-the-edge cameras

    🚀 Ready to revolutionize your Edge AI computer vision projects? Get a quote 👉 https://lnkd.in/ehPpip-x

    #machinevision #computervision #engineeringinnovation #eventcamera #evs #IoT #EdgeAI #industrialautomation #smartcity #lowpower
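As a back-of-envelope illustration of why a sparse event stream from a 320x320 array needs so little bandwidth and power, the sketch below compares it against a dense frame readout. Everything except the 320x320 resolution is an assumption for the example: the baseline frame rate, the bytes-per-pixel, the event rate (which in reality is scene-dependent) and the bytes-per-event encoding are all hypothetical figures.

```python
# Back-of-envelope data-rate comparison for a 320x320 sensor.
# All figures except the resolution are assumptions for illustration.
width, height = 320, 320
fps = 1_000                  # assumed frame rate for the frame-based baseline
bytes_per_pixel = 1          # assumed 8-bit grayscale readout
frame_rate_bps = width * height * bytes_per_pixel * fps

events_per_second = 100_000  # assumed event rate for a sparse scene
bytes_per_event = 8          # assumed packed (x, y, t, polarity) encoding
event_rate_bps = events_per_second * bytes_per_event

print(f"frame-based: {frame_rate_bps / 1e6:.1f} MB/s")         # 102.4 MB/s
print(f"event-based: {event_rate_bps / 1e6:.1f} MB/s")         # 0.8 MB/s
print(f"reduction:   {frame_rate_bps / event_rate_bps:.0f}x")  # 128x
```

Under these assumptions the event stream carries two orders of magnitude less data, which is the mechanism behind the low-power figures quoted in the post.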

  •

    🚀 Our award-winning GenX320 Metavision® sensor is enabling the next generation of edge and IoT technology! The GenX320 sensor is designed to deliver unparalleled efficiency, precision, and performance for consumer devices down to microwatt power levels in an ultra-compact 3x4mm format.

    Discover its key applications:
    🔹 Eye Tracking: Achieve high-speed and low-latency eye tracking, perfect for enhancing user interfaces and virtual/augmented reality experiences.
    🔹 Gesture Recognition: Enable accurate and real-time gesture recognition for more intuitive human-machine interactions in gaming, interactive displays, and more.
    🔹 Object Detection and Tracking: Benefit from the sensor's ability to detect and track objects with exceptional speed and accuracy, ideal for surveillance, security, and industrial automation.
    🔹 Fall Detection: Utilize the sensor’s capabilities for reliable and prompt fall detection, providing critical safety solutions while ensuring the subject’s privacy in healthcare and assisted living environments.
    🔹 Active Markers: Leverage active markers for high-speed tracking applications, ensuring robust performance even in challenging lighting conditions.
    🔹 Inside-Out Tracking: Enhance spatial awareness and navigation in VR/AR systems through precise inside-out tracking, making immersive experiences more responsive and realistic.

    🌟 Explore the limitless possibilities of the GenX320 sensor and discover how it can transform your applications with its advanced event-based Metavision technology 👉 https://lnkd.in/gW26Jv4Y

    #neuromorphic #evs #eventcamera #visionsensor #computervision #AR #VR #XR #IoT #EdgeAI

  •

    🌟 We’re thrilled to return to VISION in Stuttgart this year! Don’t miss out – visit our booth to experience NEW exclusive demos from Prophesee and partners, featuring our cutting-edge event-based Metavision® technologies 🚀 Contact us to secure a meeting with experts in the world's most advanced neuromorphic vision systems 👉 https://lnkd.in/dbVkWsFF

    View organization page for VISION

    4,827 followers

    🟡 Sneak Preview | PROPHESEE 🟡

    Prophesee is the inventor of the world's most advanced neuromorphic vision systems. Its breakthrough Event-Based Vision can be applied to industrial automation, AR/VR, IoT and more. Prophesee's Metavision platform of pixel-intelligent sensors & AI enables greater performance (10,000 fps equivalent), power efficiency (microwatt consumption) & dynamic range (120dB) than traditional approaches.

    New products include the world's smallest & most power-efficient event-based sensor GenX320, developed for integration into ultra-low-power Edge AI vision devices, & the industry's first event-based vision development solution running on the FPGA-based AMD Kria KV260 Vision AI Starter Kit.

    ⏩ Want to find out more? Stop by booth 8B29 at VISION from 8 - 10 October 2024!
    ⏩ Get your ticket now: www.vision-fair.de/tickets

    Image rights: PROPHESEE
    #VISIONSTR #sensor #IoT #MachineVision #AI

  •

    Thanks to Karen Heyman of Semiconductor Engineering for including our point of view on the topic of power management! Through biologically inspired principles of redundancy suppression, light-contrast detection, and sparsity of data acquisition, event-based vision sensors are setting new power-efficiency standards in computer vision applications. Learn more: https://lnkd.in/d5qFJ2yp #neuromorphic #eventcamera #eventsensor #evs #machinevision #computervision

    View organization page for Semiconductor Engineering

    83,902 followers

    New Approaches Needed For Power Management

    https://semiengineering.com

  •

    [COMMUNITY] 🏆 Congratulations to the brilliant team behind the paper "EventPS: Real-Time Photometric Stereo Using an Event Camera" for receiving an Honorable Mention at CVPR 2024 🌟 This recognition places their research among the top 10 out of more than 11,500 submissions!

    👁️🗨️ We're proud to share that this innovative research utilized our award-winning event-based Metavision® EVK4 HD camera. EventPS capitalizes on the exceptional temporal resolution, dynamic range, and low bandwidth characteristics of event cameras to enhance real-time 3D shape capturing. This breakthrough method estimates surface normals only from radiance changes, benefiting from event data sparsity, and has been demonstrated to run efficiently at over 30 fps in real-world scenarios.

    👏 Kudos to the talented researchers Bohan Yu, Jieji Ren, Jin Han, Feishi Wang, Jinxiu Liang and Boxin Shi from Peking University, Shanghai Jiao Tong University (上海交通大学), The University of Tokyo and the National Institute of Informatics for this truly remarkable achievement. Read the paper here 👉 https://lnkd.in/eeRJ44mN

    📢 To support academic advancements, we're offering 20% off Metavision® EVK4 for academics only, until July 15. Don't miss this opportunity to elevate your research – order now 👉 https://lnkd.in/eyMxVMn8

    #eventcamera #evs #computervision #machinevision #eventbasedvision #Metavision #Innovation #CVPR2024 #CVPR

    EventPS: Real-Time Photometric Stereo Using an Event Camera

    openaccess.thecvf.com
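For background on the classical model that photometric stereo methods like EventPS build on, the sketch below shows textbook Lambertian photometric stereo for a single pixel: given intensities measured under known light directions, solve I = L·(albedo·n) in least squares to recover the surface normal. This is a generic illustration of the classical technique, not the EventPS algorithm itself, which works from event-driven radiance changes rather than absolute intensities; all numbers here are made up for the example.

```python
import numpy as np

# Classical Lambertian photometric stereo for one pixel: I = L @ (albedo * n).
# Generic textbook sketch, not the EventPS method from the paper.
L = np.array([[0.0, 0.0, 1.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
L = L / np.linalg.norm(L, axis=1, keepdims=True)  # unit light directions

true_n = np.array([0.0, 0.0, 1.0])  # assumed surface normal, facing upward
albedo = 0.5                        # assumed surface reflectance
I = L @ (albedo * true_n)           # simulated noise-free intensities

g, *_ = np.linalg.lstsq(L, I, rcond=None)  # solve for g = albedo * normal
est_albedo = np.linalg.norm(g)
est_n = g / est_albedo
print(est_n, est_albedo)  # ~[0, 0, 1], ~0.5
```

With three or more non-coplanar light directions the system is fully determined, which is why real setups use several lights (or, in EventPS, a moving light source observed through radiance changes).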

  •

    Thrilled to see our Metavision® Gen 3 sensor featured in this groundbreaking project by Davide Scaramuzza and Daniel Gehrig, published in Nature. Their innovative work highlights the promise of event-based vision to enhance safety and autonomy features in automotive applications. As Davide Scaramuzza explains, event-based vision strikes a crucial balance between bandwidth and latency, meeting the high safety standards in the automotive industry. Notably, they show that a 20-Hz RGB camera plus an event camera achieves the same latency as a 5,000-Hz camera with the bandwidth of a 50-Hz camera, i.e. an over 100-fold bandwidth reduction, without compromising accuracy. Read more below, and don't miss the excellent explainer by Ars Technica: https://lnkd.in/g984F24j Jacek Krywko

    View profile of Davide Scaramuzza

    Associate Professor of Robotics and Perception at University of Zurich

    We are thrilled to share our groundbreaking paper published today in Nature: "Low Latency Automotive Vision with Event Cameras." Paper: https://lnkd.in/dFEtFsxU Video: https://lnkd.in/dAr4CX5g Code & Dataset: https://lnkd.in/djYW6mKH

    Frame-based sensors such as the RGB cameras used in the automotive industry face a bandwidth–latency trade-off: higher frame rates reduce perceptual latency but increase bandwidth demands, whereas lower frame rates save bandwidth at the cost of missing vital scene dynamics due to increased perceptual latency (see Fig. 1a of the paper). Event cameras have emerged as alternative vision sensors to address this trade-off. Event cameras measure changes in intensity asynchronously, offering high temporal resolution and sparsity, markedly reducing bandwidth and latency requirements. Despite these advantages, event-camera-based algorithms are either highly efficient but lag behind image-based ones in accuracy, or sacrifice the sparsity and efficiency of events to achieve comparable results.

    To overcome this, we propose a hybrid event- and frame-based object detector based on Deep Asynchronous GNNs, which preserves the advantages of each modality and thus does not suffer from this trade-off. Our method exploits the high temporal resolution and sparsity of events and the rich but low temporal resolution information in standard images to generate efficient, high-rate object detections, reducing perceptual and computational latency. In doing so, it emulates the slow-fast pathways in biological neural networks and uses them to its advantage. We show that using a 20-Hz RGB camera plus an event camera achieves the same latency as a 5,000-Hz camera with the bandwidth of a 50-Hz camera, i.e. an over 100-fold bandwidth reduction, without compromising accuracy. Our approach paves the way for efficient and robust perception in edge-case scenarios by uncovering the potential of event cameras.

    We release the code and the dataset (DSEC-Detection) to the public. Kudos to Daniel Gehrig, who, with this work, also received the UZH Annual Award for the Best PhD thesis!

    **Reference**
    Daniel Gehrig, Davide Scaramuzza
    Low Latency Automotive Vision with Event Cameras
    Nature, May 29, 2024. DOI: 10.1038/s41586-024-07409-w
    PDF (Open Access): https://lnkd.in/dFEtFsxU
    Video (Narrated): https://lnkd.in/dAr4CX5g
    Code & Datasets: https://lnkd.in/djYW6mKH

    University of Zurich UZH Innovation Hub European Research Council (ERC) NCCR Robotics Switzerland Innovation Park Zurich University of Zurich Faculty of Science
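The headline numbers in the post reduce to simple arithmetic: matching a 5,000-Hz camera's worst-case inter-frame latency (0.2 ms) while transmitting only as many frames as a 50-Hz camera implies 5,000 / 50 = 100x less frame bandwidth. A minimal sketch of that arithmetic, using only the rates quoted in the post:

```python
# Arithmetic behind the "over 100-fold bandwidth reduction" claim in the post.
high_speed_hz = 5_000    # frame rate a pure frame camera would need for the latency target
hybrid_equiv_hz = 50     # frame rate whose bandwidth the hybrid system matches

latency_ms = 1_000 / high_speed_hz            # worst-case inter-frame latency
reduction = high_speed_hz / hybrid_equiv_hz   # frame-bandwidth reduction factor

print(f"target latency: {latency_ms} ms")        # 0.2 ms
print(f"bandwidth reduction: {reduction:.0f}x")  # 100x
```

The event stream adds some bandwidth of its own, which is why the paper phrases the result as "over 100-fold" for the frame data rather than for the total.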

  •

    🔜 Join us at #CVPR24 in Seattle, US from June 19 to 21! 📅 Secure a meeting to get unparalleled insights for your computer vision projects with Event-Based Metavision® and see what others can't: https://lnkd.in/e-D8Y7Pg

    Come by our booth 1921 to experience firsthand the impact of Prophesee Metavision® systems on computer vision applications:
    👉 Mobile Image Deblurring: Metavision sensor and AI running on Qualcomm Reference Design to natively cancel motion blur by capturing true motion information pixel-by-pixel.
    👉 3D Structured Light: Ultra high-speed event-based depth measurement with up to 600Hz point clouds over USB, 1,152 individual laser points, and <1.5% RMSE error rate @500Hz.
    👉 Active Markers on @AMD Kria™: Track LED markers at ultra-high speed with complete background rejection at pixel level, now on the FPGA-based KV260 Vision AI Starter Kit.
    👉 GenX320 Advanced Features: Experience the world’s smallest and most power-efficient event-based vision sensor GenX320 through data streaming and preparation for AI, and experiment with our low end-to-end latency event processing on a cost-optimized MCU-based platform.
    👉 Particle Size Monitoring: Control, count and measure the size of objects moving at very high speed (up to 500,000 pixels / second) in a channel or a conveyor at 99% counting accuracy.

    See you there! #CVPR #neuromorphic #eventbasedvision #computervision #eventsensor #evs #eventcamera Computer Vision Foundation IEEE Computer Society

  •

    [BREAKING] 🚀 Announcing a strategic partnership with Ultraleap and TCL RayNeo to deliver an outstanding #AR glasses user experience.

    🤝 This collaboration brings together Ultraleap's advanced hand-tracking technology, Prophesee’s event-based Metavision® sensing technology and TCL RayNeo's AR expertise in comprehensive AR hardware and software development, to create AR glasses that offer leading-edge interactivity and immersion.

    ⚡ Ultraleap's technology together with Prophesee’s event-based Metavision offers unrivaled low power consumption and high speed, ensuring accurate tracking and natural interactions that replace buttons and capacitors.

    🔔 Follow us for the latest updates on this project as we set new standards for immersive and intuitive experiences in smart AR wearables. Learn more 👉 https://lnkd.in/d_5jBbsK

    #eventbased #evs #machinevision #eventsensor #eventcamera #AR #SmartWearables #touchfree #handtracking


Similar pages

Browse jobs

Funding

PROPHESEE 7 rounds in total

Last round

Series C

US$46,812,350.00

See more information on Crunchbase