US 11,812,048 B2
Image coding method and image coding apparatus
Sue Mon Thet Naing, San Jose, CA (US); Chong Soon Lim, Singapore (SG); Kyaw Kyaw Win, Singapore (SG); Hai Wei Sun, Singapore (SG); Viktor Wahadaniah, Singapore (SG); Takahiro Nishi, Nara (JP); Hisao Sasai, Osaka (JP); Youji Shibahara, Osaka (JP); Toshiyasu Sugio, Osaka (JP); Kyoko Tanikawa, Osaka (JP); Toru Matsunobu, Osaka (JP); and Kengo Terada, Osaka (JP)
Assigned to SUN PATENT TRUST, New York, NY (US)
Filed by Sun Patent Trust, New York, NY (US)
Filed on Jul. 27, 2022, as Appl. No. 17/874,468.
Application 17/874,468 is a continuation of application No. 17/124,659, filed on Dec. 17, 2020, granted, now 11,451,815.
Application 17/124,659 is a continuation of application No. 16/808,524, filed on Mar. 4, 2020, granted, now 10,904,554, issued on Jan. 26, 2021.
Application 16/808,524 is a continuation of application No. 16/407,540, filed on May 9, 2019, granted, now 10,623,762, issued on Apr. 14, 2020.
Application 16/407,540 is a continuation of application No. 16/014,260, filed on Jun. 21, 2018, granted, now 10,334,268, issued on Jun. 25, 2019.
Application 16/014,260 is a continuation of application No. 15/840,570, filed on Dec. 13, 2017, granted, now 10,034,015, issued on Jul. 24, 2018.
Application 15/840,570 is a continuation of application No. 15/471,097, filed on Mar. 28, 2017, granted, now 9,883,201, issued on Jan. 30, 2018.
Application 15/471,097 is a continuation of application No. 13/905,724, filed on May 30, 2013, granted, now 9,648,323, issued on May 9, 2017.
Application 13/905,724 is a continuation of application No. PCT/JP2013/000465, filed on Jan. 29, 2013.
Claims priority of provisional application 61/594,718, filed on Feb. 3, 2012.
Prior Publication US 2022/0360809 A1, Nov. 10, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. H04N 19/51 (2014.01); H04N 19/70 (2014.01); H04N 19/172 (2014.01); H04N 19/46 (2014.01); H04N 19/107 (2014.01); H04N 19/52 (2014.01); H04N 19/105 (2014.01); H04N 19/174 (2014.01); H04N 19/184 (2014.01)
CPC H04N 19/51 (2014.11) [H04N 19/105 (2014.11); H04N 19/107 (2014.11); H04N 19/172 (2014.11); H04N 19/46 (2014.11); H04N 19/52 (2014.11); H04N 19/70 (2014.11); H04N 19/174 (2014.11); H04N 19/184 (2014.11)] 3 Claims
OG exemplary drawing
 
1. An integrated circuit that executes operations comprising:
obtaining, from a header of a slice included in a first picture, a temporal motion vector prediction flag indicating whether or not temporal motion vector prediction is to be performed on the first picture;
judging, using the obtained temporal motion vector prediction flag, whether or not the temporal motion vector prediction is to be performed on the first picture, the temporal motion vector prediction using a temporal motion vector predictor derived from a motion vector of a co-located reference picture;
when said judging judges that the temporal motion vector prediction is to be performed on the first picture, (i) creating a first list of motion vector predictors that includes at least one temporal motion vector predictor derived from the motion vector of the co-located reference picture, (ii) obtaining a first parameter from a bitstream, the first parameter indicating a first motion vector predictor included in the first list, (iii) decoding the first picture using the first motion vector predictor indicated by the first parameter, and (iv) decoding a second picture following the first picture in decoding order by using the temporal motion vector prediction using the temporal motion vector predictor derived from the motion vector of the co-located reference picture preceding the first picture; and
when said judging judges that the temporal motion vector prediction is not to be performed on the first picture, (i) creating a second list of motion vector predictors that does not include the temporal motion vector predictor derived from the motion vector of the co-located reference picture, (ii) obtaining a second parameter from a bitstream, the second parameter indicating a second motion vector predictor included in the second list, (iii) decoding the first picture using the second motion vector predictor indicated by the second parameter, and (iv) decoding the second picture by using the temporal motion vector prediction using the temporal motion vector predictor derived from a motion vector of the first picture and without using the motion vector of the co-located reference picture preceding the first picture,
wherein a number of the motion vector predictors included in the first list and a number of the motion vector predictors included in the second list are the same.
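The decision logic recited in the claim can be sketched roughly as follows. This is an illustrative sketch only, not the patented implementation: every name here (`build_mvp_list`, `temporal_mvp_enabled_flag`, `spatial_mvps`, the zero-motion padding, the default list size) is an assumption chosen for illustration and does not come from the patent or from any codec specification.

```python
# Hypothetical sketch of the claimed TMVP-flag decoding behavior.
# All identifiers are illustrative assumptions, not terms from the patent.

def build_mvp_list(spatial_mvps, temporal_mvp, list_size):
    """Build a fixed-size motion vector predictor list.

    Per the claim's final limitation, the list holds the same number of
    predictors whether or not the temporal predictor is included: when
    temporal motion vector prediction is disabled, the temporal candidate
    is simply replaced by another candidate (here, a zero-motion vector).
    """
    candidates = list(spatial_mvps)
    if temporal_mvp is not None:
        candidates.append(temporal_mvp)
    # Pad with zero-motion candidates so both lists have equal length.
    while len(candidates) < list_size:
        candidates.append((0, 0))
    return candidates[:list_size]


def select_predictor(slice_header, mvp_index, colocated_ref_mv, list_size=2):
    """Judge the flag, build the corresponding list, and pick the
    predictor indicated by the parsed parameter (the claim's "first
    parameter" / "second parameter")."""
    if slice_header["temporal_mvp_enabled_flag"]:
        # First list: includes the predictor derived from the
        # co-located reference picture's motion vector.
        temporal_mvp = colocated_ref_mv
    else:
        # Second list: the co-located temporal predictor is excluded.
        temporal_mvp = None
    mvp_list = build_mvp_list(slice_header["spatial_mvps"],
                              temporal_mvp, list_size)
    return mvp_list[mvp_index]
```

A short usage example under the same assumptions: with one spatial candidate and a list size of two, enabling the flag makes index 1 select the temporal candidate, while disabling it makes index 1 select the zero-motion replacement, and both lists have equal length.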