US 11,507,204 C1 (13,153rd)
Selecting virtual objects in a three-dimensional space
James M. Powderly, Ft. Lauderdale, FL (US); Savannah Niles, Ft. Lauderdale, FL (US); Frank Hamilton, Martinsburg, WV (US); Marshal A. Fontaine, St. Augustine, FL (US); Rony Abovitz, Weston, FL (US); and Alysha Naples, London (GB)
Filed by Magic Leap, Inc., Plantation, FL (US)
Assigned to MAGIC LEAP, INC.
Reexamination Request No. 90/019,786, Dec. 23, 2024.
Reexamination Certificate for Patent 11,507,204, issued Nov. 22, 2022, Appl. No. 17/454,793, Nov. 12, 2021.
Application 90/019,786 is a continuation of application No. 16/682,794, filed on Nov. 13, 2019, granted, now 11,175,750.
Application 16/682,794 is a continuation of application No. 15/296,869, filed on Oct. 18, 2016, granted, now 10,521,025.
Claims priority of provisional application 62/316,179, filed on Mar. 31, 2016.
Claims priority of provisional application 62/301,422, filed on Feb. 29, 2016.
Claims priority of provisional application 62/244,115, filed on Oct. 20, 2015.
Ex Parte Reexamination Certificate issued on Jan. 22, 2026.
Int. Cl. G06F 3/0346 (2013.01); G06F 1/16 (2006.01); G06F 3/01 (2006.01); G06F 3/04815 (2022.01); G06F 3/0482 (2013.01); G06F 3/04883 (2022.01)
CPC G06F 3/0346 (2013.01) [G06F 1/163 (2013.01); G06F 3/011 (2013.01); G06F 3/012 (2013.01); G06F 3/013 (2013.01); G06F 3/016 (2013.01); G06F 3/017 (2013.01); G06F 3/04815 (2013.01); G06F 3/0482 (2013.01); G06F 3/04883 (2013.01)]
OG exemplary drawing
AS A RESULT OF REEXAMINATION, IT HAS BEEN DETERMINED THAT:
Claims 1, 5 and 7 are determined to be patentable as amended.
Claims 2-4, 6 and 8-14, dependent on an amended claim, are determined to be patentable.
New claims 15-24 are added and determined to be patentable.
1. A method for selecting a virtual object located in three-dimensional (3D) space, the method comprising:
under control of an augmented reality (AR) system comprising computer hardware, the AR system configured to permit user interaction with interactable objects in a field of view (FOV) of a user:
determining a group of interactable objects in the FOV of the user;
identifying a target interactable object from the subgroup [ group ] of interactable objects based on a relative position of the target interactable object, wherein the relative position of the target interactable object comprises at least one of the following relative to the other objects of the subgroup [ group ] of interactable objects: closest to a midpoint of the user's FOV, [ one of a ] leftmost [ point ] in the user's FOV, and/or [ or a ] rightmost [ point ] in the user's FOV; and
initiating a selection event on the target interactable object.
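Read as an algorithm, amended claim 1 selects whichever object in the group is closest to the FOV midpoint, leftmost, or rightmost. A minimal sketch of that positional test, assuming a hypothetical object record carrying a signed horizontal angle from the FOV midpoint (the representation and names are illustrative, not part of the claim):

```python
from dataclasses import dataclass

@dataclass
class Interactable:
    """Hypothetical interactable object, located by its horizontal angle
    (degrees) from the midpoint of the user's field of view (FOV):
    negative = left of midpoint, positive = right."""
    name: str
    angle: float

def identify_target(group, criterion="closest"):
    """Identify a target from the group by relative position in the FOV:
    closest to the FOV midpoint, leftmost, or rightmost."""
    if not group:
        return None
    if criterion == "closest":
        return min(group, key=lambda o: abs(o.angle))
    if criterion == "leftmost":
        return min(group, key=lambda o: o.angle)
    if criterion == "rightmost":
        return max(group, key=lambda o: o.angle)
    raise ValueError(f"unknown criterion: {criterion}")

# Example group of interactables in the user's FOV.
group = [Interactable("menu", -20.0),
         Interactable("globe", 3.0),
         Interactable("browser", 15.0)]
```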
5. The method of claim 1, further comprising receiving a selection of the target interactable object from the subgroup [ group ] of interactable objects.
7. The method of claim 1,
wherein receiving the selection of the target interactable object from the group of interactable objects comprises:
receiving a first input from a user device; and
in response to receiving the first input, identifying the target interactable object from the subgroup [ group ] of the interactable objects.
[ 15. A method for selecting a virtual object located in three-dimensional (3D) space, the method comprising:
under control of an augmented reality (AR) system comprising computer hardware, the AR system configured to permit user interaction with interactable objects in a field of view (FOV) of a user:
determining a group of interactable objects in the FOV of the user;
identifying a target interactable object from the group of interactable objects based on a relative position of the target interactable object, wherein the relative position of the target interactable object comprises at least one of the following relative to the other objects of the group of interactable objects: closest to a midpoint of the user's FOV, leftmost in the user's FOV, and/or rightmost in the user's FOV, wherein the target interactable object is automatically reoriented such that a surface of the target interactable object that initially does not face the user is oriented to face the user; and
initiating a selection event on the target interactable object.]
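New claim 15 adds that the target is automatically reoriented so that a surface initially facing away from the user is turned to face the user. One way such a reorientation might be computed, assuming the user stands at the origin of a ground-plane coordinate system; the coordinate and yaw conventions here are assumptions, not part of the claim:

```python
import math

def face_user_yaw(obj_x, obj_z):
    """Yaw (radians) that points a virtual object at (obj_x, obj_z)
    back toward a user assumed to stand at the origin."""
    # Direction from the object to the user, projected on the ground plane.
    return math.atan2(-obj_x, -obj_z)

def reorient(obj_yaw, obj_x, obj_z):
    """Yaw delta to apply so the object's front surface faces the user."""
    target = face_user_yaw(obj_x, obj_z)
    # Wrap the delta into (-pi, pi] so the object takes the short way round.
    return (target - obj_yaw + math.pi) % (2 * math.pi) - math.pi
```

An object directly left of the user (negative x) would be turned to yaw +pi/2; an object already facing the user receives a delta of zero.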
[ 16. A method for selecting a virtual object located in three-dimensional (3D) space, the method comprising:
under control of an augmented reality (AR) system comprising computer hardware, the AR system configured to permit user interaction with interactable objects in a field of view (FOV) of a user:
determining a group of interactable objects in the FOV of the user;
determining whether a current mode for interacting with the group of interactable objects is a first mode or a second mode, the first mode being different than the second mode;
identifying, in the first mode, a target interactable object from the group of interactable objects based on a relative position of the target interactable object, wherein the relative position of the target interactable object comprises at least one of the following relative to the other objects of the group of interactable objects: closest to a midpoint of the user's FOV, leftmost in the user's FOV, or rightmost in the user's FOV;
identifying, in the second mode, the target interactable object from the group of interactable objects; and
initiating a selection event on the target interactable object.]
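New claims 16-22 recite two interaction modes: the mode may be chosen from factors such as object density (claim 17), and in the second mode the target is identified from user pose or input (claims 21-22). A minimal sketch under that reading; the density measure, threshold, and (name, angle) object representation are all assumptions:

```python
def choose_mode(group, fov_size, density_threshold=2.0):
    """Choose between the two interaction modes based on the density of
    interactable objects in the FOV (cf. claim 17). The density measure
    and threshold are illustrative assumptions."""
    if fov_size <= 0:
        raise ValueError("FOV size must be positive")
    density = len(group) / fov_size
    return "first" if density >= density_threshold else "second"

def identify_target_by_mode(group, mode, gaze_angle=0.0):
    """First mode: identify by relative position (closest to the FOV
    midpoint). Second mode: identify from user pose, modeled here as a
    gaze angle (cf. claim 21). Objects are (name, angle) tuples."""
    if not group:
        return None
    if mode == "first":
        return min(group, key=lambda o: abs(o[1]))
    return min(group, key=lambda o: abs(o[1] - gaze_angle))

# Example: a dense FOV selects the first (positional) mode.
group = [("menu", -20.0), ("globe", 3.0), ("browser", 15.0)]
```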
[ 17. The method of claim 16, wherein the determining whether the current mode is the first mode or the second mode is at least partially based on a density of the group of interactable objects.]
[ 18. The method of claim 16, wherein the determining whether the current mode is the first mode or the second mode is at least partially based on a user input.]
[ 19. The method of claim 16, wherein the determining whether the current mode is the first mode or the second mode is at least partially based on a user preference.]
[ 20. The method of claim 16, wherein the determining whether the current mode is the first mode or the second mode is at least partially based on a user pose.]
[ 21. The method of claim 16, wherein the identifying, in the second mode, includes identifying the target interactable based on a user pose.]
[ 22. The method of claim 16, wherein the identifying, in the second mode, includes identifying the target interactable based on a user input.]
[ 23. The method of claim 16, further comprising:
displaying a first focus indicator associated with the target interactable object when the current mode is the first mode; and
displaying a second focus indicator associated with the target interactable object, different from the first focus indicator, when the current mode is in the second mode.]
[ 24. A method for selecting a virtual object located in three-dimensional (3D) space, the method comprising:
under control of an augmented reality (AR) system comprising computer hardware, the AR system configured to permit user interaction with interactable objects in a field of view (FOV) of a user:
determining a group of interactable objects in the FOV of the user;
identifying an initial target interactable object as a target interactable object from the subgroup of interactable objects based on a relative position of the target interactable object, wherein the relative position of the initial target interactable object comprises at least one of the following relative to the other objects of the subgroup of interactable objects: closest to a midpoint of the user's FOV, leftmost in the user's FOV, and/or rightmost in the user's FOV;
changing the target interactable object from the initial target interactable object to a further target interactable object within the group of objects in the FOV of the user and different than the initial target interactable object based on at least one of user input, user pose, or contextual information associated with the group of interactable objects in the FOV of the user, wherein the changing the target interactable object from the initial target interactable object to the further target interactable object is based on the user input as a hand gesture; and
initiating a selection event on the further target interactable object.]
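New claim 24 recites changing the target from an initial object to a further object in the FOV based on user input in the form of a hand gesture. A minimal sketch, assuming hypothetical swipe-gesture labels and (name, angle) object tuples ordered left to right by FOV angle:

```python
def change_target(group, current, gesture):
    """Change the target interactable within the group in response to a
    hand gesture. The 'swipe_left' / 'swipe_right' labels are assumed,
    not recited in the claim."""
    ordered = sorted(group, key=lambda o: o[1])  # left-to-right by angle
    i = ordered.index(current)
    if gesture == "swipe_left" and i > 0:
        i -= 1
    elif gesture == "swipe_right" and i < len(ordered) - 1:
        i += 1
    return ordered[i]

# Example: the initial target (closest to the FOV midpoint) moves one
# object left or right per gesture, clamped at the edges of the group.
group = [("menu", -20.0), ("globe", 3.0), ("browser", 15.0)]
```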