Patents by Inventor Ivan Poupyrev

Ivan Poupyrev has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12270898
    Abstract: Techniques and apparatuses are described that implement a smart-device-based radar system capable of determining characteristics of objects external to a vehicle, occupants within a vehicle, and objects proximal to an open-air vehicle. In particular, the system enables a smart device to perform many vehicle operations such as collision avoidance, occupant detection, and parking assistance in vehicle and open-air vehicle environments without integrated radar technology. By using a smart device to perform such actions, existing vehicles and open-air vehicles without integrated radar functionality may be able to leverage radar-based vehicle operations.
    Type: Grant
    Filed: October 17, 2019
    Date of Patent: April 8, 2025
    Assignee: Google LLC
    Inventors: Chih Yu Chen, YungSheng Chang, Ivan Poupyrev
  • Publication number: 20250094454
    Abstract: This application is directed to an integrated multimodal neural network driven by a natural language prompt. A computer system obtains sensor data from a plurality of sensor devices disposed in a physical environment during a time duration. One or more information items are generated to characterize one or more signature events detected within the time duration in the sensor data. The computer system obtains a natural language prompt. In response to the natural language prompt, the computer system applies a large behavior model (e.g., a large language model, a data processing model) to process the one or more information items and the natural language prompt jointly and generate a multimodal output (e.g., textual statements, software code, an image or video, an information dashboard having a predefined format, a user interface, and a heatmap). The multimodal output associated with the sensor data is presented. A minimal sketch of this pipeline appears after this entry.
    Type: Application
    Filed: August 26, 2024
    Publication date: March 20, 2025
    Inventors: Ivan Poupyrev, Brandon Barbello, Leonardo Giusti, Jaime Lien, Nicholas Edward Gillian
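    The abstract above describes a pipeline in which sensor data is first condensed into event-level "information items" and a large behavior model then answers a natural-language prompt over those items. The sketch below is a minimal, hypothetical rendering of that flow in Python; the `InformationItem` structure, the `LargeBehaviorModel` interface, and the `EchoModel` stub are illustrative assumptions, not types from the publication.
    ```python
    from dataclasses import dataclass
    from typing import Protocol


    @dataclass
    class InformationItem:
        """Sensor-agnostic summary of one signature event (hypothetical structure)."""
        sensor_id: str
        timestamp_s: float
        description: str


    class LargeBehaviorModel(Protocol):
        """Stand-in interface for the large behavior model named in the abstract."""
        def generate(self, prompt: str, items: list[InformationItem]) -> str: ...


    def answer_prompt(model: LargeBehaviorModel,
                      items: list[InformationItem],
                      prompt: str) -> str:
        """Process the prompt and the event summaries jointly and return the output."""
        return model.generate(prompt, items)


    class EchoModel:
        """Trivial concrete model so the sketch runs end to end."""
        def generate(self, prompt: str, items: list[InformationItem]) -> str:
            summary = "; ".join(item.description for item in items)
            return f"{prompt!r} over {len(items)} event(s): {summary}"


    items = [InformationItem("door_sensor", 12.5, "front door opened"),
             InformationItem("radar_0", 13.1, "person entered the hallway")]
    print(answer_prompt(EchoModel(), items, "Who came home this afternoon?"))
    ```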
  • Publication number: 20250077508
    Abstract: This document describes techniques and devices for a radar recognition-aided search. Through use of a radar-based recognition system, gestures made by, and physiological information about, persons can be determined. In the case of physiological information, the techniques can use this information to refine a search. For example, if a person requests a search for a coffee shop, the techniques may refine the search to coffee shops in the direction that the person is walking. In the case of a gesture, the techniques may refine or base a search solely on the gesture. Thus, a search for information about a store, car, or tree can be made responsive to a gesture pointing at the store, car, or tree with or without explicit entry of a search query. A rough sketch of the direction-based refinement appears after this entry.
    Type: Application
    Filed: November 19, 2024
    Publication date: March 6, 2025
    Applicant: Google LLC
    Inventors: Ivan Poupyrev, Gaetano Roberto Aiello
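    The entry above gives the example of narrowing a coffee-shop search to the direction the person is walking. The sketch below shows that kind of direction-based refinement, assuming a hypothetical `Place` result type and a radar-derived heading; the 45-degree tolerance is an arbitrary illustrative choice.
    ```python
    from dataclasses import dataclass


    @dataclass
    class Place:
        name: str
        bearing_deg: float  # direction from the user to the place


    def refine_by_heading(results: list[Place], heading_deg: float,
                          tolerance_deg: float = 45.0) -> list[Place]:
        """Keep only results lying roughly in the user's radar-derived walking direction."""
        def angular_diff(a: float, b: float) -> float:
            return abs((a - b + 180.0) % 360.0 - 180.0)
        return [p for p in results
                if angular_diff(p.bearing_deg, heading_deg) <= tolerance_deg]


    # A walking heading of 90 degrees (east) keeps only the eastward result.
    shops = [Place("Cafe A", 85.0), Place("Cafe B", 250.0)]
    print([p.name for p in refine_by_heading(shops, heading_deg=90.0)])  # ['Cafe A']
    ```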
  • Publication number: 20250068885
    Abstract: This application is directed to integrated multimodal neural networks. A computer system obtains sensor data from a plurality of sensor devices during a time duration, and the plurality of sensor devices include at least two distinct sensor types and are disposed in a physical environment. One or more signature events are detected in the sensor data, and one or more information items are generated to characterize the one or more signature events detected in the sensor data, independently of the sensor types of the sensor devices. A large behavior model is applied to process the one or more information items and generate a multimodal output associated with the sensor data. The multimodal output describes the signature events associated with the sensor data in one of a plurality of predefined output modalities. The multimodal output is presented according to the one of the plurality of predefined output modalities.
    Type: Application
    Filed: August 26, 2024
    Publication date: February 27, 2025
    Inventors: Ivan Poupyrev, Brandon Barbello, Leonardo Giusti, Jaime Lien, Nicholas Edward Gillian
  • Publication number: 20250068860
    Abstract: This application is directed to compressing sensor data. A computer system obtains the sensor data from a plurality of sensor devices disposed in a physical environment during a time duration, and each sensor device corresponds to a temporal sequence of respective sensor samples. For each of the plurality of sensor devices, the temporal sequence of respective sensor samples is processed to generate an ordered sequence of respective sensor data features defining a respective parametric representation of the temporal sequence of respective sensor samples, independently of a sensor type of the respective sensor device. The computer system detects one or more signature events within the time duration based on the respective parametric representations of the plurality of sensor devices, and generates one or more information items characterizing the one or more signature events detected in the sensor data. A simple illustrative parameterization appears after this entry.
    Type: Application
    Filed: August 26, 2024
    Publication date: February 27, 2025
    Inventors: Ivan Poupyrev, Brandon Barbello, Leonardo Giusti, Jaime Lien, Nicholas Edward Gillian
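    The abstract above describes converting each sensor's sample stream into a compact parametric representation, independent of sensor type, and then detecting signature events from those parameters. The publication does not specify the parameterization; the sketch below assumes a simple per-window mean/variance summary with an arbitrary variance threshold.
    ```python
    import statistics


    def parametrize(samples: list[float], window: int = 16) -> list[tuple[float, float]]:
        """Compress a temporal sequence of samples into per-window (mean, variance) features."""
        features = []
        for start in range(0, len(samples) - window + 1, window):
            chunk = samples[start:start + window]
            features.append((statistics.fmean(chunk), statistics.pvariance(chunk)))
        return features


    def detect_signature_events(features: list[tuple[float, float]],
                                variance_threshold: float = 1.0) -> list[int]:
        """Flag windows whose variance spikes above the threshold as candidate events."""
        return [i for i, (_, variance) in enumerate(features) if variance > variance_threshold]


    quiet = [0.0] * 32
    signal = quiet + [5.0, -5.0] * 8 + quiet              # a short burst between two quiet stretches
    print(detect_signature_events(parametrize(signal)))   # [2]: the window covering the burst
    ```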
  • Patent number: 12210086
    Abstract: Techniques and apparatuses are described that implement a smart-device-based radar system capable of performing location tagging. The radar system has sufficient spatial resolution to recognize different external environments associated with different locations (e.g., recognize different rooms or different locations within a same room). Using the radar system, the smart device can achieve spatial awareness and automatically activate user-programmed applications or settings associated with the different locations. In this way, the radar system enables the smart device to provide a location-specific shortcut for various applications or settings. With the location-specific shortcut, the smart device can improve the user's experience and reduce the need to repeatedly navigate cumbersome interfaces. A toy sketch of this location-matching shortcut appears after this entry.
    Type: Grant
    Filed: October 13, 2020
    Date of Patent: January 28, 2025
    Assignee: Google LLC
    Inventors: Dongeek Shin, Ivan Poupyrev
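    The patent above describes recognizing which learned location the radar currently "sees" and triggering the user-programmed settings tied to it. The sketch below is a toy version of that shortcut, assuming the radar scene is already reduced to a small feature vector and matched by nearest neighbor; the rooms, vectors, and actions are made up for illustration.
    ```python
    import math

    # User-programmed shortcuts keyed by learned location labels (illustrative only).
    LOCATION_ACTIONS = {
        "kitchen": lambda: print("Opening the recipe app"),
        "bedroom": lambda: print("Enabling do-not-disturb"),
    }

    # Previously learned radar scene signatures, one feature vector per location.
    LEARNED_SIGNATURES = {
        "kitchen": [0.9, 0.1, 0.4],
        "bedroom": [0.2, 0.8, 0.6],
    }


    def classify_location(signature: list[float]) -> str:
        """Nearest-neighbor match of the current radar signature against learned rooms."""
        return min(LEARNED_SIGNATURES,
                   key=lambda name: math.dist(signature, LEARNED_SIGNATURES[name]))


    def apply_location_shortcut(signature: list[float]) -> None:
        """Run whatever the user programmed for the recognized location."""
        LOCATION_ACTIONS[classify_location(signature)]()


    apply_location_shortcut([0.85, 0.15, 0.45])  # -> "Opening the recipe app"
    ```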
  • Patent number: 12153571
    Abstract: This document describes techniques and devices for a radar recognition-aided search. Through use of a radar-based recognition system, gestures made by, and physiological information about, persons can be determined. In the case of physiological information, the techniques can use this information to refine a search. For example, if a person requests a search for a coffee shop, the techniques may refine the search to coffee shops in the direction that the person is walking. In the case of a gesture, the techniques may refine or base a search solely on the gesture. Thus, a search for information about a store, car, or tree can be made responsive to a gesture pointing at the store, car, or tree with or without explicit entry of a search query.
    Type: Grant
    Filed: October 26, 2023
    Date of Patent: November 26, 2024
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Gaetano Roberto Aiello
  • Publication number: 20240370096
    Abstract: This document describes techniques and systems for radar-based gesture-recognition with context-sensitive gating and other context-sensitive controls. Sensor data from a proximity sensor and/or a movement sensor produces a context of a user equipment. The techniques and systems enable the user equipment to recognize contexts when a radar system can be unreliable and should not be used for gesture-recognition, enabling the user equipment to automatically disable or “gate” the output from the radar system according to context. The user equipment prevents the radar system from transitioning to a high-power state to perform gesture-recognition in contexts where radar data detected by the radar system is likely due to unintentional input. By so doing, the techniques conserve power, improve accuracy, or reduce latency relative to many common techniques and systems for radar-based gesture-recognition. A simplified sketch of the gating decision appears after this entry.
    Type: Application
    Filed: July 17, 2024
    Publication date: November 7, 2024
    Applicant: Google LLC
    Inventors: Vignesh Sachidanandam, Ivan Poupyrev, Leonardo Giusti, Devon James O'Reilley Stern, Jung Ook Hong, Patrick M. Amihood, John David Jacobs, Abel Seleshi Mengistu, Brandon Barbello, Tyler Reed Kugler
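    The publication above describes suppressing radar gesture output, and avoiding the radar's high-power state, in contexts where returns are probably unintentional. The sketch below is one simplified reading of that gating decision; the specific contexts used here (face down, in motion, covered proximity sensor) are illustrative assumptions.
    ```python
    from dataclasses import dataclass


    @dataclass
    class Context:
        """Context inferred from the proximity and movement sensors."""
        face_down: bool
        in_motion: bool          # e.g., walking with the phone in a bag or pocket
        proximity_covered: bool  # the proximity sensor is blocked


    def radar_gestures_enabled(ctx: Context) -> bool:
        """Gate radar gesture-recognition in contexts where input is likely unintentional."""
        if ctx.face_down or ctx.proximity_covered:
            return False   # display occluded: gestures are almost certainly unintended
        if ctx.in_motion:
            return False   # vigorous motion makes radar returns unreliable
        return True        # otherwise let the radar enter its gesture-recognition mode


    print(radar_gestures_enabled(Context(face_down=False, in_motion=False,
                                         proximity_covered=False)))  # True
    ```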
  • Publication number: 20240369685
    Abstract: Techniques are described herein that enable advanced gaming and virtual reality control using radar. These techniques enable small motions and displacements to be tracked, even in the millimeter or submillimeter scale, for user control actions even when those actions are optically occluded or obscured.
    Type: Application
    Filed: July 17, 2024
    Publication date: November 7, 2024
    Applicant: Google LLC
    Inventors: Patrick M. Amihood, Ivan Poupyrev
  • Patent number: 12117560
    Abstract: This document describes apparatuses and techniques for radar-enabled sensor fusion. In some aspects, a radar field is provided and reflection signals that correspond to a target in the radar field are received. The reflection signals are transformed to provide radar data, from which a radar feature indicating a physical characteristic of the target is extracted. Based on the radar features, a sensor is activated to provide supplemental sensor data associated with the physical characteristic. The radar feature is then augmented with the supplemental sensor data to enhance the radar feature, such as by increasing an accuracy or resolution of the radar feature. By so doing, performance of sensor-based applications, which rely on the enhanced radar features, can be improved. A schematic sketch of this fusion loop appears after this entry.
    Type: Grant
    Filed: August 4, 2021
    Date of Patent: October 15, 2024
    Assignee: Google LLC
    Inventors: Nicholas Edward Gillian, Carsten C. Schwesig, Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
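    The patent above describes extracting a radar feature, waking a complementary sensor tied to the same physical characteristic, and folding the supplemental reading back in to sharpen the feature. The sketch below is schematic; the feature-to-sensor mapping and the confidence-weighted blend are assumptions, not the patented method.
    ```python
    from dataclasses import dataclass
    from typing import Callable


    @dataclass
    class RadarFeature:
        name: str          # e.g., "target_range"
        value: float
        confidence: float  # 0..1


    # Hypothetical mapping from a radar feature to a sensor that can refine it.
    SUPPLEMENTAL_SENSORS: dict[str, Callable[[], float]] = {
        "target_range": lambda: 1.02,  # stubbed supplemental reading, e.g., from a depth sensor
    }


    def fuse(feature: RadarFeature) -> RadarFeature:
        """Augment a radar feature with supplemental sensor data to raise its accuracy."""
        read_sensor = SUPPLEMENTAL_SENSORS.get(feature.name)
        if read_sensor is None:
            return feature                # nothing to fuse with
        supplemental = read_sensor()      # activate the supplemental sensor only when needed
        fused = feature.confidence * feature.value + (1.0 - feature.confidence) * supplemental
        return RadarFeature(feature.name, fused, min(1.0, feature.confidence + 0.2))


    print(fuse(RadarFeature("target_range", 1.10, confidence=0.6)))
    ```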
  • Patent number: 12111713
    Abstract: This document describes techniques and systems that enable a smartphone-based radar system for determining user intention in a lower-power mode. The techniques and systems use a radar field to enable the smartphone to accurately determine the presence or absence of a user and further determine the intention of the user to interact with the smartphone. Using these techniques, the smartphone can account for the user's nonverbal communication cues to determine and maintain an awareness of users in its environment, and only respond to direct interactions once a user has demonstrated an intention to interact, which preserves battery power. The smartphone may determine the user's intention by recognizing various cues from the user, such as a change in position relative to the smartphone, a change in posture, or by an explicit action, such as a gesture. A condensed sketch of the intent decision appears after this entry.
    Type: Grant
    Filed: February 9, 2022
    Date of Patent: October 8, 2024
    Assignee: Google LLC
    Inventors: Leonardo Giusti, Ivan Poupyrev, Eiji Hayashi, Patrick M. Amihood
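    The patent above describes keeping the phone in a lower-power state until nonverbal cues, such as approaching, leaning in, or an explicit gesture, indicate an intention to interact. The sketch below condenses that decision; the cue set and the 0.8 m threshold are illustrative assumptions.
    ```python
    from dataclasses import dataclass


    @dataclass
    class UserCues:
        present: bool
        distance_m: float
        facing_device: bool
        explicit_gesture: bool


    def interaction_intended(cues: UserCues) -> bool:
        """Decide whether to leave the lower-power mode and respond to the user."""
        if not cues.present:
            return False               # nobody nearby: stay in the lower-power mode
        if cues.explicit_gesture:
            return True                # an explicit gesture always signals intent
        # Otherwise require the user to be close and oriented toward the device.
        return cues.distance_m < 0.8 and cues.facing_device


    print(interaction_intended(UserCues(present=True, distance_m=0.5,
                                        facing_device=True, explicit_gesture=False)))  # True
    ```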
  • Patent number: 12093463
    Abstract: This document describes techniques and systems for radar-based gesture-recognition with context-sensitive gating and other context-sensitive controls. Sensor data from a proximity sensor and/or a movement sensor produces a context of a user equipment. The techniques and systems enable the user equipment to recognize contexts when a radar system can be unreliable and should not be used for gesture-recognition, enabling the user equipment to automatically disable or “gate” the output from the radar system according to context. The user equipment prevents the radar system from transitioning to a high-power state to perform gesture-recognition in contexts where radar data detected by the radar system is likely due to unintentional input. By so doing, the techniques conserve power, improve accuracy, or reduce latency relative to many common techniques and systems for radar-based gesture-recognition.
    Type: Grant
    Filed: August 25, 2022
    Date of Patent: September 17, 2024
    Assignee: Google LLC
    Inventors: Vignesh Sachidanandam, Ivan Poupyrev, Leonardo Giusti, Devon James O'Reilley Stern, Jung Ook Hong, Patrick M. Amihood, John David Jacobs, Abel Seleshi Mengistu, Brandon Barbello, Tyler Reed Kugler
  • Patent number: 12085670
    Abstract: Techniques are described herein that enable advanced gaming and virtual reality control using radar. These techniques enable small motions and displacements to be tracked, even in the millimeter or submillimeter scale, for user control actions even when those actions are optically occluded or obscured.
    Type: Grant
    Filed: May 4, 2023
    Date of Patent: September 10, 2024
    Assignee: Google LLC
    Inventors: Patrick M. Amihood, Ivan Poupyrev
  • Patent number: 12073028
    Abstract: Techniques of identifying gestures include detecting and classifying inner-wrist muscle motions at a user's wrist using micron-resolution radar sensors. For example, a user of an AR system may wear a band around their wrist. When the user makes a gesture to manipulate a virtual object in the AR system as seen in a head-mounted display (HMD), muscles and ligaments in the user's wrist make small movements on the order of 1-3 mm. The band contains a small radar device that has a transmitter and a number of receivers (e.g., three) of electromagnetic (EM) radiation on a chip (e.g., a Soli chip). This radiation reflects off the wrist muscles and ligaments and is received by the receivers on the chip in the band. The received reflected signal, or signal samples, is then sent to processing circuitry for classification to identify the wrist movement as a gesture. A minimal sketch of the classification step appears after this entry.
    Type: Grant
    Filed: February 24, 2023
    Date of Patent: August 27, 2024
    Assignee: GOOGLE LLC
    Inventors: Dongeek Shin, Shahram Izadi, David Kim, Sofien Bouaziz, Steven Benjamin Goldberg, Ivan Poupyrev, Shwetak N. Patel
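    The patent above describes classifying the reflected radar signal from millimeter-scale wrist-muscle motion into a gesture. The abstract does not name a classifier; the sketch below stands in with a nearest-centroid rule over made-up feature vectors.
    ```python
    import math

    # Hypothetical per-gesture centroids learned from radar signal samples.
    GESTURE_CENTROIDS = {
        "pinch": [0.7, 0.2, 0.1],
        "swipe": [0.1, 0.8, 0.3],
        "fist":  [0.4, 0.4, 0.9],
    }


    def classify_gesture(sample_features: list[float]) -> str:
        """Map features extracted from the reflected signal to the nearest known gesture."""
        return min(GESTURE_CENTROIDS,
                   key=lambda gesture: math.dist(sample_features, GESTURE_CENTROIDS[gesture]))


    print(classify_gesture([0.65, 0.25, 0.15]))  # -> "pinch"
    ```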
  • Publication number: 20240094827
    Abstract: Systems and techniques are described for robust radar-based gesture-recognition. A radar system detects radar-based gestures on behalf of application subscribers. A state machine transitions between multiple states based on inertial sensor data. A no-gating state enables the radar system to output radar-based gestures to application subscribers. The state machine also includes a soft-gating state that prevents the radar system from outputting the radar-based gestures to the application subscribers. A hard-gating state prevents the radar system from detecting radar-based gestures altogether. The techniques and systems enable the radar system to determine when not to perform gesture-recognition, enabling user equipment to automatically reconfigure the radar system to meet user demand. By so doing, the techniques conserve power, improve accuracy, or reduce latency relative to many common techniques and systems for radar-based gesture-recognition. A compact sketch of this state machine appears after this entry.
    Type: Application
    Filed: November 28, 2023
    Publication date: March 21, 2024
    Applicant: Google LLC
    Inventors: Jung Ook Hong, Patrick M. Amihood, John David Jacobs, Abel Seleshi Mengistu, Leonardo Giusti, Vignesh Sachidanandam, Devon James O'Reilley Stern, Ivan Poupyrev, Brandon Barbello, Tyler Reed Kugler, Johan Prag, Artur Tsurkan, Alok Chandel, Lucas Dupin Moreira Costa, Selim Flavio Cinek
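    The publication above names three states driven by inertial data: no-gating (gestures delivered to subscribers), soft-gating (gestures recognized but withheld), and hard-gating (recognition disabled). The sketch below compacts that state machine into one transition function; the accelerometer threshold and the "stowed" signal are illustrative assumptions.
    ```python
    from enum import Enum, auto


    class GatingState(Enum):
        NO_GATING = auto()    # radar gestures are delivered to application subscribers
        SOFT_GATING = auto()  # gestures are still detected but withheld from subscribers
        HARD_GATING = auto()  # radar gesture detection is disabled entirely


    def next_state(accel_magnitude_g: float, device_stowed: bool) -> GatingState:
        """Choose the gating state from inertial-sensor evidence (thresholds are illustrative)."""
        if device_stowed:
            return GatingState.HARD_GATING   # e.g., in a pocket or bag: stop detecting
        if accel_magnitude_g > 1.5:
            return GatingState.SOFT_GATING   # shaky handling: detect but do not deliver
        return GatingState.NO_GATING


    print(next_state(accel_magnitude_g=0.2, device_stowed=False))  # GatingState.NO_GATING
    ```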
  • Publication number: 20240054126
    Abstract: This document describes techniques and devices for a radar recognition-aided search. Through use of a radar-based recognition system, gestures made by, and physiological information about, persons can be determined. In the case of physiological information, the techniques can use this information to refine a search. For example, if a person requests a search for a coffee shop, the techniques may refine the search to coffee shops in the direction that the person is walking. In the case of a gesture, the techniques may refine or base a search solely on the gesture. Thus, a search for information about a store, car, or tree can be made responsive to a gesture pointing at the store, car, or tree with or without explicit entry of a search query.
    Type: Application
    Filed: October 26, 2023
    Publication date: February 15, 2024
    Applicant: Google LLC
    Inventors: Ivan Poupyrev, Gaetano Roberto Aiello
  • Patent number: 11868537
    Abstract: Systems and techniques are described for robust radar-based gesture-recognition. A radar system detects radar-based gestures on behalf of application subscribers. A state machine transitions between multiple states based on inertial sensor data. A no-gating state enables the radar system to output radar-based gestures to application subscribers. The state machine also includes a soft-gating state that prevents the radar system from outputting the radar-based gestures to the application subscribers. A hard-gating state prevents the radar system from detecting radar-based gestures altogether. The techniques and systems enable the radar system to determine when not to perform gesture-recognition, enabling user equipment to automatically reconfigure the radar system to meet user demand. By so doing, the techniques conserve power, improve accuracy, or reduce latency relative to many common techniques and systems for radar-based gesture-recognition.
    Type: Grant
    Filed: April 29, 2022
    Date of Patent: January 9, 2024
    Inventors: Jung Ook Hong, Patrick M. Amihood, John David Jacobs, Abel Seleshi Mengistu, Leonardo Giusti, Vignesh Sachidanandam, Devon James O'Reilley Stern, Ivan Poupyrev, Brandon Barbello, Tyler Reed Kugler, Johan Prag, Artur Tsurkan, Alok Chandel, Lucas Dupin Moreira Costa, Selim Flavio Cinek
  • Patent number: 11841933
    Abstract: This document describes techniques and systems that enable radar-based authentication status feedback. A radar field is used to enable an electronic device to account for the user's distal physical cues to determine and maintain an awareness of the user's location and movements around the device. This awareness allows the device to anticipate some of the user's intended interactions and provide functionality in a timely and seamless manner, such as preparing an authentication system to authenticate the user before the user touches or speaks to the device. These features also allow the device to provide visual feedback that can help the user understand that the device is aware of the user's location and movements. In some cases, the feedback is provided using visual elements presented on a display.
    Type: Grant
    Filed: June 15, 2020
    Date of Patent: December 12, 2023
    Assignee: Google LLC
    Inventors: Leonardo Giusti, Ivan Poupyrev, Vignesh Sachidanandam, Johan Prag, Brandon Barbello, Tyler Reed Kugler, Alok Chandel
  • Publication number: 20230393665
    Abstract: Techniques of identifying gestures include detecting and classifying inner-wrist muscle motions at a user's wrist using micron-resolution radar sensors. For example, a user of an AR system may wear a band around their wrist. When the user makes a gesture to manipulate a virtual object in the AR system as seen in a head-mounted display (HMD), muscles and ligaments in the user's wrist make small movements on the order of 1-3 mm. The band contains a small radar device that has a transmitter and a number of receivers (e.g., three) of electromagnetic (EM) radiation on a chip (e.g., a Soli chip). This radiation reflects off the wrist muscles and ligaments and is received by the receivers on the chip in the band. The received reflected signal, or signal samples, is then sent to processing circuitry for classification to identify the wrist movement as a gesture.
    Type: Application
    Filed: February 24, 2023
    Publication date: December 7, 2023
    Inventors: Dongeek Shin, Shahram Izadi, David Kim, Sofien Bouaziz, Steven Benjamin Goldberg, Ivan Poupyrev, Shwetak N. Patel
  • Publication number: 20230367400
    Abstract: This document describes techniques for radio frequency (RF) based micro-motion tracking. These techniques enable even millimeter-scale hand motions to be tracked. To do so, radar signals are used from radar systems that, with conventional techniques, would only permit resolutions of a centimeter or more. A brief sketch of phase-based displacement recovery appears after this entry.
    Type: Application
    Filed: July 21, 2023
    Publication date: November 16, 2023
    Applicants: Google LLC, The Board of Trustees of the Leland Stanford Junior University
    Inventors: Jaime Lien, Erik M. Olson, Patrick M. Amihood, Ivan Poupyrev
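    Millimeter-scale motion is commonly recovered from a radar whose range resolution is only centimeters by tracking the phase of the reflected signal: over the two-way path, a phase change Δφ at wavelength λ corresponds to a radial displacement of about λ·Δφ/(4π). The sketch below applies that standard relation as an illustration of the general idea, not the specific method claimed in this publication.
    ```python
    import math


    def displacement_from_phase(delta_phase_rad: float, wavelength_m: float) -> float:
        """Radial displacement implied by a phase change of the reflected radar signal.

        For a two-way (transmit and return) path, d = wavelength * delta_phase / (4 * pi).
        """
        return wavelength_m * delta_phase_rad / (4.0 * math.pi)


    # Example: a 60 GHz radar (wavelength ~5 mm) observing a 0.5 rad phase shift
    # corresponds to roughly 0.2 mm of motion, well below the range-bin resolution.
    wavelength_m = 3e8 / 60e9
    print(f"{displacement_from_phase(0.5, wavelength_m) * 1e3:.3f} mm")
    ```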