US 11,805,988 B2
Endoscope system
Sho Shinji, Tokyo (JP)
Assigned to OLYMPUS CORPORATION, Tokyo (JP)
Filed by OLYMPUS CORPORATION, Tokyo (JP)
Filed on Dec. 1, 2020, as Appl. No. 17/108,216.
Application 17/108,216 is a continuation of application No. PCT/JP2018/021597, filed on Jun. 5, 2018.
Prior Publication US 2021/0106215 A1, Apr. 15, 2021
Int. Cl. A61B 1/00 (2006.01); A61B 1/045 (2006.01); A61B 1/06 (2006.01)
CPC A61B 1/045 (2013.01) [A61B 1/00006 (2013.01); A61B 1/000094 (2022.02); A61B 1/06 (2013.01); A61B 1/0605 (2022.02); A61B 1/0638 (2013.01)] 9 Claims
OG exemplary drawing
 
1. An endoscope system comprising:
an illumination portion that comprises an emitter and that is configured to radiate an illumination light beam onto an imaging subject, the illumination light beam having an intensity distribution in which light portions and dark portions are spatially repeated in a beam cross section perpendicular to an optical axis;
a controller configured to cause widths of the dark portions in the intensity distribution of the illumination light beam to change;
an imager configured to acquire a plurality of illumination images of the imaging subject being illuminated with illumination light beams in which the widths of the dark portions are different from each other; and
at least one processor comprising hardware, the processor being configured to:
create first separation images and second separation images from each of the plurality of illumination images, the first separation images containing a greater quantity of information about a deep layer of the imaging subject than the second separation images do; and
calculate information about depths of a feature portion in the imaging subject on the basis of the plurality of first separation images and the plurality of second separation images created from the plurality of illumination images,
wherein the processor is configured to:
create the first and second separation images on the basis of at least two types of intensity values among the intensity values of pixels that are in the illumination images and that respectively correspond to, in the intensity distribution, the light portions, the dark portions, and portions having intensity values that are between those of the light portions and those of the dark portions; and
calculate the information about the depths of the feature portion on the basis of changes among the plurality of first separation images and changes among the plurality of second separation images.
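The processing recited in claim 1 can be pictured with the short sketch below (Python with NumPy). It is illustrative only and not the patented implementation: it assumes that, for each dark-portion width, the imager supplies a set of phase-shifted stripe images so that every pixel is covered by both a light portion and a dark portion across the set; the per-pixel min/max separation and the visibility-based depth ranking are common structured-illumination heuristics standing in for the claimed first/second separation images and depth calculation; and all function and variable names (separate_layers, estimate_depth_rank, feature_mask, and so on) are hypothetical.

    # Illustrative sketch only -- not the patented implementation.
    import numpy as np

    def separate_layers(stripe_images):
        # Split phase-shifted stripe images into deep- and surface-layer images.
        # A pixel under a dark portion is lit mainly by light scattered back from
        # deeper tissue, while a pixel under a light portion also receives
        # surface-reflected light, so the per-pixel minimum approximates the
        # deep-layer component and (maximum - minimum) the surface component.
        stack = np.stack(stripe_images, axis=0)
        i_min = stack.min(axis=0)      # dark-portion intensity at each pixel
        i_max = stack.max(axis=0)      # light-portion intensity at each pixel
        deep = 2.0 * i_min             # "first separation image" analogue
        surface = i_max - i_min        # "second separation image" analogue
        return deep, surface

    def feature_visibility(image, feature_mask):
        # Crude measure of how strongly a feature region (boolean mask) stands
        # out from the rest of one separation image.
        inside = image[feature_mask].mean()
        outside = image[~feature_mask].mean()
        return abs(inside - outside) / (outside + 1e-9)

    def estimate_depth_rank(stripe_images_by_width, feature_mask):
        # Rank the feature's depth from how the separated images change as the
        # dark-portion width is varied (wider dark portions probe deeper):
        # return the smallest width at which the feature is more visible in the
        # deep-layer image than in the surface-layer image.
        for width in sorted(stripe_images_by_width):
            deep, surface = separate_layers(stripe_images_by_width[width])
            if feature_visibility(deep, feature_mask) > feature_visibility(surface, feature_mask):
                return width           # coarse proxy for the feature's depth
        return None                    # feature remains a surface-layer feature

The factor of 2 in the deep-layer estimate follows the usual 50% duty-cycle assumption for stripe patterns; the claim itself only requires that the separation images be formed from at least two of the light, dark, and intermediate pixel intensity values, and that depth be derived from how the two sets of separation images change as the dark-portion width changes.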