CPC G06F 3/011 (2013.01) [G06F 1/163 (2013.01); G06T 15/005 (2013.01); G06T 19/006 (2013.01); G06V 20/20 (2022.01)]

AS A RESULT OF REEXAMINATION, IT HAS BEEN DETERMINED THAT:
Claims 1, 13, 14, 17-19, 22-29, 31 and 32 are determined to be patentable as amended.
Claims 2-12, 15, 16, 20, 21 and 30, dependent on an amended claim, are determined to be patentable.
New claims 33-39 are added and determined to be patentable.
1. An apparatus configured to be worn on a head of a user, comprising:
a screen configured to present graphics to the user;
a camera system configured to view an environment in which the user is located; and
a processing unit configured to determine a map
wherein the processing unit of the apparatus is also configured to obtain a metric indicating a likelihood of success to localize the user using the map, and wherein the processing unit is configured to obtain the metric by computing the metric or by receiving the metric [ ;
wherein the processing unit is configured to determine the map based on multiple images from the camera system, and wherein the metric is based on respective scores of the images and is a measure of a quality of the map, the metric being a separate data item and different from the respective scores of the images] .
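As an illustrative sketch only (the claim does not prescribe an implementation, and all names here are hypothetical), the amended limitation of claim 1 can be read as: per-image scores are aggregated into a single map-quality metric that is stored as a separate data item, distinct from the scores themselves:

```python
# Hypothetical sketch: per-image scores aggregated into one map-quality
# metric, kept as a separate data item from the raw scores.
from dataclasses import dataclass
from typing import List

@dataclass
class MapQuality:
    """The metric: a separate data item, different from the image scores."""
    value: float  # in [0, 1]; higher = localization more likely to succeed

def compute_metric(image_scores: List[float]) -> MapQuality:
    """Aggregate per-image scores from the camera frames into one
    likelihood-of-localization metric (simple clamped average here)."""
    if not image_scores:
        return MapQuality(value=0.0)
    avg = sum(image_scores) / len(image_scores)
    return MapQuality(value=max(0.0, min(1.0, avg)))
```

The particular aggregation (an average) is an assumption; the point of the sketch is only the separation between the per-image scores and the derived metric.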
13. The apparatus of claim 1, wherein the processing unit is configured to perform an optimization based on [ the ] images from the camera system, three-dimensional reference points, and a relative orientation between cameras of the camera system.
14. The apparatus of claim 1, wherein the processing unit is configured to determine
17. An apparatus configured to be worn on a head of a user, comprising:
a screen configured to present graphics to the user;
a camera system configured to view an environment in which the user is located; and
a processing unit configured to determine a map
wherein the processing unit of the apparatus is also configured to obtain a metric indicating a likelihood of success to localize the user using the map, and wherein the processing unit is configured to obtain the metric by computing the metric or by receiving the metric;
wherein the metric is for one of a plurality of cells, and each of the cells represents a three dimensional space of a portion of the environment [ ; and
wherein the processing unit is configured to determine the map based on multiple images from the camera system, and wherein the metric is based on respective scores of the images and is a measure of a quality of the map, the metric being a separate data item and different from the respective scores of the images] .
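A minimal sketch of the per-cell structure recited in claim 17, under assumptions not stated in the claim (an axis-aligned grid of cubic cells and a saturating observation count as the per-cell quantity; all names hypothetical):

```python
# Hypothetical sketch: one metric per cell, where each cell represents
# a three-dimensional region of a portion of the environment.
from typing import Dict, List, Tuple

Cell = Tuple[int, int, int]  # integer grid coordinates of a 3-D region

def cell_of(point: Tuple[float, float, float], cell_size: float = 2.0) -> Cell:
    """Map a 3-D point to the axis-aligned cubic cell containing it."""
    return tuple(int(c // cell_size) for c in point)

def per_cell_metrics(observations: Dict[Cell, List[int]]) -> Dict[Cell, float]:
    """One metric per cell; here it saturates as observations accumulate."""
    return {cell: min(1.0, len(obs) / 10.0) for cell, obs in observations.items()}
```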
18. The apparatus of claim 17,
identifying
determining [ the ] respective scores for the images
summing the scores to obtain a total score.
19. The apparatus of claim 18, wherein the processing unit is also configured to determine
22. The apparatus of claim 18, wherein the processing unit is configured to determine each of the respective scores by determining a number of reference point(s) that is detected in the corresponding one of the images
23. An apparatus configured to be worn on a head of a user, comprising:
a screen configured to present graphics to the user;
a camera system configured to view an environment in which the user is located; and
a processing unit configured to determine a map
wherein the processing unit of the apparatus is also configured to obtain a metric indicating a likelihood of success to localize the user using the map, and wherein the processing unit is configured to obtain the metric by computing the metric
wherein the processing unit is configured to determine the metric by:
obtaining a plurality of images from the camera system; and
determining co-visibility values [ respectively for the images] , wherein each of the co-visibility values indicates a number of reference points detected in a corresponding one of the plurality of images [ ; and
wherein the processing unit is configured to determine the map based on the images from the camera system, and wherein the metric is based on the co-visibility values respectively for the images and is a measure of a quality of the map, the metric being a separate data item and different from the co-visibility values] .
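The co-visibility computation in claim 23 can be sketched as follows (a non-authoritative illustration; the aggregation into the metric, the threshold, and all names are assumptions, not claim language):

```python
# Hypothetical sketch: a co-visibility value per image = the number of
# map reference points detected in that image; the metric is derived
# from, but stored separately from, those values.
from typing import List, Set

def covisibility_values(detections_per_image: List[Set[int]]) -> List[int]:
    """One value per image: the count of reference points it detects."""
    return [len(points) for points in detections_per_image]

def metric_from_covisibility(values: List[int], min_points: int = 20) -> float:
    """Fraction of images that see enough reference points to localize."""
    if not values:
        return 0.0
    return sum(v >= min_points for v in values) / len(values)
```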
24. An apparatus configured to be worn on a head of a user, comprising:
a screen configured to present graphics to the user;
a camera system configured to view an environment in which the user is located; and
a processing unit configured to determine a map
wherein the processing unit of the apparatus is also configured to obtain a metric indicating a likelihood of success to localize the user using the map, and wherein the processing unit is configured to obtain the metric by computing the metric or by receiving the metric;
wherein the processing unit is configured to perform a sanitization to remove or to disregard data that would otherwise provide an undesirable contribution for the map if the data is used to determine the map [ ; and
wherein the processing unit is configured to determine the map based on multiple images from the camera system, and wherein the metric is based on respective scores of the images and is a measure of a quality of the map, the metric being a separate data item and different from the respective scores of the images] .
25. The apparatus of claim 24,
26. The apparatus of claim 24, wherein the processing unit is configured to perform a bundle adjustment to adjust one or more rays associated with one or more [ of the ] images from the camera system, wherein the processing unit is configured to perform the bundle adjustment after performing the sanitization to remove the data.
27. An apparatus configured to be worn on a head of a user, comprising:
a screen configured to present graphics to the user;
a camera system configured to view an environment in which the user is located; and
a processing unit configured to determine a map
wherein the processing unit of the apparatus is also configured to obtain a metric indicating a likelihood of success to localize the user using the map, and wherein the processing unit is configured to obtain the metric by computing the metric or by receiving the metric;
[ wherein the processing unit is configured to determine the map based on multiple images from the camera system, and wherein the metric is based on respective scores of the images and is a measure of a quality of the map, the metric being a separate data item and different from the respective scores of the images;]
wherein the processing unit is configured to determine
wherein the processing unit is configured to perform data sanitization based on the
wherein the processing unit is configured to remove a constraint of the
28. An apparatus configured to be worn on a head of a user, comprising:
a screen configured to present graphics to the user;
a camera system configured to view an environment in which the user is located; and
a processing unit configured to determine a map
wherein the processing unit of the apparatus is also configured to obtain a metric indicating a likelihood of success to localize the user using the map, and wherein the processing unit is configured to obtain the metric by computing the metric or by receiving the metric;
wherein the processing unit is configured to determine the map by:
determining multiple map segments; and
connecting the map segments [ ; and
wherein the processing unit is configured to determine the map based on images from the camera system, and wherein the metric is based on respective scores of the images and is a measure of a quality of the map, the metric being a separate data item and different from the respective scores of the images] .
29. The apparatus of claim 28, wherein the processing unit is configured to determine a first map segment of the map segments by
31. The apparatus of claim 30, wherein the processing unit is configured to start the second map segment when the score [ of the additional image ] indicates that the [ additional ] image has a degree of constraint with respect to the first map segment that is below a threshold.
32. A method performed by an apparatus that is configured to be worn on a head of a user, the apparatus having a screen configured to present graphics to the user, a camera system configured to view an environment in which the user is located, and a processing unit, the method comprising:
obtaining, by the processing unit,
determining a map by the processing unit
obtaining, by the processing unit, a metric indicating a likelihood of success to localize the user using the map, wherein the act of obtaining comprises computing the metric or receiving the metric by the processing unit [ ;
wherein the map is determined by the processing unit based on the images from the camera system, and wherein the metric is based on respective scores of the images and is a measure of a quality of the map, the metric being a separate data item and different from the respective scores of the images] .
[ 33. The apparatus of claim 1, wherein the apparatus is configured to inform the user of the metric, and wherein the apparatus is configured to receive an input from the user to initiate a map generation session after the apparatus informs the user of the metric.]
[ 34. The apparatus of claim 17, wherein the apparatus is configured to inform the user of the metric, and wherein the apparatus is configured to receive an input from the user to initiate a map generation session after the apparatus informs the user of the metric.]
[ 35. The apparatus of claim 23, wherein the apparatus is configured to inform the user of the metric, and wherein the apparatus is configured to receive an input from the user to initiate a map generation session after the apparatus informs the user of the metric.]
[ 36. The apparatus of claim 24, wherein the apparatus is configured to inform the user of the metric, and wherein the apparatus is configured to receive an input from the user to initiate a map generation session after the apparatus informs the user of the metric.]
[ 37. The apparatus of claim 27, wherein the apparatus is configured to inform the user of the metric, and wherein the apparatus is configured to receive an input from the user to initiate a map generation session after the apparatus informs the user of the metric.]
[ 38. The apparatus of claim 28, wherein the apparatus is configured to inform the user of the metric, and wherein the apparatus is configured to receive an input from the user to initiate a map generation session after the apparatus informs the user of the metric.]
[ 39. The method of claim 32, further comprising informing the user of the metric, and receiving an input from the user to initiate a map generation session after the user is informed of the metric.]