CPC Definition - Subclass H04N

Last Updated Version: 2024.01
PICTORIAL COMMUNICATION, e.g. TELEVISION
Definition statement

This place covers:

Television systems

  • Television systems, whether general or specially adapted for colour television
  • Details of television systems of general applicability, or specific to colour television, and also including scanning details of television systems
  • Coding, decoding, compressing or decompressing of digital video signals
  • Stereoscopic television systems, whether general or specially adapted for colour television, or details therefor
  • Selective distribution of pictorial content, in particular interactive television or video on demand [VOD]
  • Diagnosis, testing or measuring for television systems or details therefor
  • Transmission of pictures or their transient or permanent reproduction either locally or remotely, by methods involving both the following steps:
  • Step (a): the scanning of a picture, i.e. resolving the whole picture-containing area into individual picture-elements and the derivation of picture-representative electric signals related thereto, simultaneously or in sequence;
  • Step (b): the reproduction of the whole picture-containing area by the reproduction of individual picture-elements into which the picture is resolved by means of picture-representative electric signals derived therefrom, simultaneously or in sequence;
  • (In group H04N 1/00) Systems for the transmission or the reproduction of arbitrarily composed pictures or patterns in which the local light variations composing a picture are not subject to variation with time, e.g. documents (both written and printed), maps, charts, photographs (other than cinematograph films);
  • Circuits specially designed for dealing with pictorial communication signals, e.g. television signals, as distinct from merely signals of a particular frequency range.

Other pictorial communication

  • Scanning, transmission or reproduction of documents or the like, in particular facsimile transmission
  • Details pertaining to scanning, transmission or reproduction of documents or the like, in particular details of facsimile transmission
Relationships with other classification places

Subclass G09G covers arrangements for control of display devices characterised by the type of display device or its components, whilst H04N covers the acquisition and control of the picture signal which is intended for display.

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Instruments for performing medical examinations of the interior of cavities or tubes of the body combined with television appliances

A61B 1/04

Arrangements of television sets in vehicles; Arrangement of controls thereof

B60R 11/02

Mounting of cameras operative during drive of a vehicle

B60R 11/04

Arrangements of cameras in aircraft

B64D 47/08

Controlling or regulating single-crystal growth by pulling from a melt, using television detectors

C30B 15/26

Inspecting textile materials by television means

D06H 3/08

Scanning a visible indication of a measured value and reproducing this indication at a remote place, e.g. on the screen of a cathode-ray tube

G01D 5/39

Burglar, theft, or intruder alarms using television cameras

G08B 13/196

Structural combination of reactor core or moderator structure with television camera

G21C 17/08

Informative references

Attention is drawn to the following places, which may be of interest for search:

Video games, i.e. games using an electronically generated display having two or more dimensions

A63F 13/00

Systems for the reproduction according to the above-mentioned step (b) of pictures comprising alphanumeric or like character forms and involving the generation according to the above-mentioned step (a) of picture-representative electric signals from a pre-arranged assembly of such characters, or records thereof, forming an integral part of the systems

B41B, G06K

Printing, duplication or marking processes, or materials therefor

B41C, B41J, B41M, G03C, G03F, G03G

Systems for the reproduction according to step (b) of Note (1) of pictures comprising alphanumeric or like character forms but involving the production of the EQUIVALENT of a signal which would be derived according to the above-mentioned step (a), e.g. by cams, punched card or tape, coded control signal, or other means

G01D, G06T, H04L

Systems for the direct photographic copying of an original picture in which an electric signal representative of the picture is derived according to the said step (a) and employed to modify the operation of the system, e.g. to control exposure

G03

Systems in which legible alphanumeric or like character forms are analysed according to step (a) of Note (1) to derive an electric signal from which the character is recognised by comparison with stored information

G06K

Image data processing or generation, in general

G06T

Circuits or other parts of systems which form the subject of other subclasses

H03C, H03F, H03J, H04H

Broadcasting

H04H

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

television systems

Systems for the transmission and reproduction of arbitrarily composed pictures in which the local light variations composing a picture MAY change with time, e.g. natural "live" scenes, recordings of such scenes such as cinematograph films

CCD

Charge-coupled device, that is, a device made up of semiconductors arranged in such a way that the electric charge output of one semiconductor charges an adjacent one

MPEG

Moving Picture Experts Group; a family of standards used for coding audio-visual information in a digital compressed format

NTSC

National Television System Committee

PAL

Phase alternating line

Picture signal generator

Circuits or arrangements receiving as input an image of a scene and delivering as output an electric signal that contains all the information required to reproduce the image of the scene

Picture reproducer

Circuits or arrangements receiving as input an electric signal characteristic of an image of a scene and producing as output a visual display of that image

SECAM

Séquentiel couleur à mémoire (Sequential Colour with Memory)

Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
Definition statement

This place covers:

  • systems for the transmission or the reproduction of arbitrarily composed pictures or patterns in which the local light variations composing a picture are not subject to variation with time, e.g. documents (both written and printed), maps, charts, photographs (other than cinematograph films);
  • transmission of time-invariant pictures, e.g. documents (both written and printed), maps, charts, photographs (other than cinematograph films), or their transient or permanent storage or reproduction either locally or remotely by methods involving both scanning and reproduction;
  • systems involving the generation, transmission, storage or reproduction of time-invariant pictures; image manipulation for such reproduction on particular output devices;
  • devices applied to the transmission, storage or reproduction of time-invariant pictures, e.g. facsimile apparatus, digital copiers, (digital) scanners, multifunctional peripheral devices;
  • circuits specially designed for dealing with pictorial communication signals, e.g. facsimile signals or colour image signals, as distinct from merely signals of a particular frequency range;
  • storage or transmission aspects of still video cameras.
Relationships with other classification places
  • H04N 1/00 is an application place for a large number of IT technologies, which are covered per se by the corresponding functional places
  • Image servers, hosts and clients use internally specific computing techniques. Corresponding techniques used in general computing are found in G06F or G06Q. This concerns data storage, software architectures, error detection or correction in general computing, monitoring, image retrieval, browsing, Internet browsing, computer security, billing or advertising
  • Image servers, hosts and clients use specific telecommunication techniques for the image transmission process. Corresponding techniques used in generic telecommunication networks are found in subclasses H04B, H04H, H04L, H04M, H04W. This concerns monitoring or testing of transmitters/receivers, broadcast or multicast, maintenance, administration, testing, data processing in data switching networks, home networks, real-time data network services, data network security, applications for data network, wireless networks per se
  • Image scanners use specific scanning techniques. Corresponding techniques are found in G02B. This concerns optical scanning systems
  • Image reproducers use specific reproduction techniques. Corresponding techniques are found in B41J, G03G, G06K. This concerns printing, electrography, producing a permanent visual presentation of output data
  • General image processing techniques are found in G06T
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning details of electrically scanned solid-state devices

H04N 3/14

Scanning of motion picture films

H04N 3/36

Television signal recording

H04N 5/76

Circuits for processing colour television signals

H04N 9/64

Capture aspects of still video cameras

H04N 23/00

Printing mechanisms

B41J

Supporting or handling copy material in printers

B41J 11/00, B41J 13/00, B41J 15/00

Handling thin or filamentary material

B65H

Colorimetry

G01J

Electrography; Magnetography

G03G

Handling of copy material in photocopiers

G03G 15/65

Constructional details of equipment or arrangements specially adapted for portable computer applications

G06F 1/1626

Power management in computer systems

G06F 1/3203

Input and output arrangements for computers

G06F 3/00

Interaction techniques for graphical user interfaces

G06F 3/048

Storage management

G06F 3/0604

Digital output to printers

G06F 3/12

Addressing or allocating within memory systems or architectures

G06F 12/00

Image retrieval

G06F 16/50

Retrieval from Web

G06F 16/95

Computer security

G06F 21/00

Sensing record carriers

G06K 7/00

Producing a permanent visual presentation of output data

G06K 15/00

Payment schemes; Commerce

G06Q 20/00, G06Q 30/00

General-purpose image data processing

G06T 1/00

Image watermarking

G06T 1/0021

Geometric image transformation in the plane of the image

G06T 3/00

Image enhancement or restoration

G06T 5/00

Image analysis

G06T 7/00

Image coding

G06T 9/00

Editing figures and text; Combining figures or text

G06T 11/60

Methods or arrangements for recognising scenes

G06V 20/00

Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition

G06V 30/00

Methods or arrangements for recognising human body or animal bodies or body parts

G06V 40/10

Methods or arrangements for acquiring or recognising human faces, facial parts, facial sketches, facial expressions

G06V 40/16

Access-control involving the use of a pass

G07C 9/20

Access-control by means of a password

G07C 9/33

Coding, decoding or code conversion, for error detection or error correction

H03M 13/00

Monitoring or testing of transmitters/receivers

H04B 17/00

Broadcast communication

H04H

Secret communication; Jamming of communication

H04K 1/00, H04K 3/00

Arrangements for detecting or preventing errors in the information received

H04L 1/00

Arrangements for secret or secure communication; Encryption

H04L 9/00

Charging arrangements in data networks

H04L 12/14

Data processing in data switching networks

H04L 12/56

Data network security

H04L 63/00

Real-time data network services

H04L 65/00

Applications for data network services

H04L 67/00

Simultaneous speech and telegraphic or other data transmission over the same conductors

H04M 11/06

Telephonic metering arrangements

H04M 15/00

Connection management in wireless communications networks

H04W 76/00

Special rules of classification

In this main group, Indexing Codes are used. The numbering of the codes is based on the numbering of the subgroups:

  • codes, e.g. H04N 1/0455, which have a numbering the first part of which corresponds to a subgroup at the tip end of a subgroup branch, e.g. H04N 1/0402, are used to classify detailed information and may be applied to that subgroup, e.g. H04N 1/0455 may be used in combination with H04N 1/0402;
  • codes, e.g. H04N 2201/0402, which have a numbering the first part of which corresponds to a subgroup at the head or node end of a subgroup branch, e.g. H04N 1/04, are used to classify orthogonal information and may be applied to any subgroup in the corresponding subgroup branch, e.g. H04N 2201/0434 may be used in combination with H04N 1/0402 and/or H04N 1/1013.
Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Additional information

any information other than the still picture itself, but nevertheless associated with the still picture

Documents or the like

documents (both written and printed), maps, charts, photographs (other than cinematograph films)

Main-scan

the first completed scan

Mode

way or manner of operating

Scanning

the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa

Still picture apparatus

any apparatus generating, storing, transmitting or reproducing non-transient images

Single-mode communication

a communication in which the mode is not changed

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

IP

Internet Protocol

OS

Operating System

PC

Personal Computer

GPS

Global Positioning System

MFP

Multifunctional peripheral

MFD

Multifunctional device

RFID

Radio-frequency identification

In patent documents, the following words/expressions are often used as synonyms:

  • Complex device and Multifunctional peripheral
  • Complex machine and Multifunctional peripheral
  • Hybrid device and Multifunctional peripheral
  • Hybrid machine and Multifunctional peripheral
  • Digital camera and Still video camera
  • Metadata and Additional information
  • Fast scan and Main scan
  • Slow scan, Subscan and Sub scan
{Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for (error detection, error correction or monitoring in digital computers or digital computer components G06F 11/00)}
References
Limiting references

This place does not cover:

Determining the necessity for preventing unauthorised reproduction

H04N 1/0084

Detecting scanning velocity or position

H04N 1/047

Fault detection in circuits or arrangements for control supervision between transmitter and receiver or between image input and image output device

H04N 1/32609

Discrimination between different image types

H04N 1/40062

Discrimination between the two tones in the picture signal of a two-tone original

H04N 1/403

Control or modification of tonal gradation or extreme levels, e.g. dependent on the contents of the original or references outside the picture,

H04N 1/407

{Systems or arrangements for the transmission of the picture signal}
References
Limiting references

This place does not cover:

Transmitting or receiving computer data via an image communication device

H04N 1/00206

Transmitting or receiving image data via a computer or computer network

H04N 1/00209

Circuits or arrangements for control or supervision between transmitter and receiver

H04N 1/32

{in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Data processing systems for commerce

G06Q 30/00

{Transmitting or receiving image data, e.g. facsimile data, via a computer, e.g. using e-mail, a computer network, the internet, I-fax}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Message switching systems, e.g. e-mail systems

H04L 51/00

{Image push arrangements, e.g. from an image reading device to a specific network destination (push-based network services H04L 67/55)}
References
Limiting references

This place does not cover:

Push-based network services

H04L 67/55

{using an image reading or reproducing device, e.g. a facsimile reader or printer, as a local input to or local output from a computer (image input to or image output from a computer via a network H04N 1/00209)}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Portable computers comprising integrated printing or scanning devices

G06F 1/1696

{with a photographic apparatus, e.g. a photographic printer or a projector (photographic apparatus per se G03B, G03D)}
References
Limiting references

This place does not cover:

Photographic apparatus per se

G03B, G03D

Special rules of classification

Typically with apparatus of the kind classified in G03B, G03D or G03G.

{with an electrophotographic copying machine, i.e. a photocopier}
Special rules of classification

Typically with apparatus of the kind classified in G03G.

{with a printing apparatus, e.g. a laser beam printer}
Special rules of classification

Typically with apparatus of the kind classified in B41J or G06K 15/00.

{with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal (details of transmission H04N 1/00095; establishing a communication with one of a facsimile machine or another apparatus sharing a single line H04N 1/32704; interfacing cordless telephone terminals with an accessory to increase the functionality of user interface H04M 1/72409)}
Special rules of classification

Typically with apparatus of the kind classified in other H04 subclasses or other H04N main groups.

{with a television apparatus}
Special rules of classification

Typically with apparatus of the kind classified in other H04N main groups.

{with studio circuitry, devices or equipment, e.g. television cameras (television studio circuitry, devices or equipment per se H04N 5/222)}
References
Limiting references

This place does not cover:

Television studio circuitry, devices or equipment per se

H04N 5/222

Special rules of classification

Typically with apparatus of the kind classified in H04N 5/222 and subgroups.

{with receiver circuitry (television receiver circuitry per se H04N 5/44)}
References
Limiting references

This place does not cover:

Television receiver circuitry per se

H04N 5/44

Special rules of classification

Typically with apparatus of the kind classified in H04N 5/44 and subgroups.

{with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus (arrangements for the associated working of recording or reproducing apparatus with related apparatus G11B 31/00)}
Special rules of classification

Typically with apparatus of the kind classified in G06K or G11B.

{Handling of original or reproduction media, e.g. cutting, separating, stacking}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Supporting or handling copy material in printers

B41J 11/00, B41J 13/00, B41J 15/00

Handling thin or filamentary material

B65H

Handling of copy material in photocopiers

G03G 15/65

{Preventing unauthorised reproduction}
References
Limiting references

This place does not cover:

Marking an unauthorised reproduction with identification

H04N 1/32101

Restricting access

H04N 1/4406

Informative references

Attention is drawn to the following places, which may be of interest for search:

Preventing copies being made in photocopiers

G03G 21/04

Details of scanning heads {; Means for illuminating the original}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of scanning arrangements

H04N 1/04

for picture information pick-up
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Television cameras

H04N 23/00

{with means for collecting light from a line or an area of the original and for guiding it to only one or a relatively low number of picture element detectors}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Light-guides per se

G02B 6/00

Heads optically focused on only one picture element at a time {(H04N 1/0281 takes precedence)}
References
Limiting references

This place does not cover:

Means for collecting light from a line or an area of the original and for guiding it to only one or a relatively low number of picture element detectors

H04N 1/0281

with photodetectors arranged in a substantially linear array
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning arrangements using multi-element arrays

H04N 1/19

Scanning arrangements {, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa} (H04N 1/387 takes precedence)
References
Limiting references

This place does not cover:

Composing, repositioning or otherwise modifying originals

H04N 1/387

Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of scanning heads

H04N 1/024

Optical scanning systems

G02B 26/10

Projection optics in photocopiers

G03G 15/0409

Character printers involving the fast moving of a light beam in two directions

G06K 15/1228

Special rules of classification

Where possible both the main and sub scanning arrangements should be classified, using a class for the invention and an Indexing Code for subsidiary information. Manual scanning and scanning using two-dimensional arrays are exceptions to this rule.

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Main scan direction

The direction of the first completed scan line

Detection, control or error compensation of scanning velocity or position ({H04N 1/0402 and } H04N 1/17 take precedence)
References
Limiting references

This place does not cover:

Scanning different formats; Scanning with different densities of dots per unit length, e.g. different numbers of dots per inch (dpi); Conversion of scanning standards

H04N 1/0402

The scanning speed being dependent on content of picture

H04N 1/17

Informative references

Attention is drawn to the following places, which may be of interest for search:

Detection, control or error compensation of scanning velocity or position in photographic character printers involving the fast moving of an optical beam in the main scan direction

G06K 15/1219

Special rules of classification

Where possible, when classifying in this subgroup, details of the main and subscan should also be classified using other subgroups of H04N 1/04.

{in subscanning direction, e.g. picture start or line-to-line synchronisation}
Special rules of classification

Where possible, when classifying in this subgroup, details of the main scan should also be classified using other subgroups of H04N 1/04.

in main scanning direction, e.g. synchronisation of line start or picture elements in a line
Special rules of classification

Where possible, when classifying in this subgroup, details of the subscan should also be classified using other subgroups of H04N 1/04.

using cylindrical picture-bearing surfaces {, i.e. scanning a main-scanning line substantially perpendicular to the axis and lying in a curved cylindrical surface}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Feeding a sheet in the subscanning direction by rotation about its axis only

H04N 1/12

using flat picture-bearing surfaces {(H04N 1/113, H04N 1/195 take precedence)}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangements for the main-scanning

H04N 1/12

using oscillating or rotating mirrors
References
Limiting references

This place does not cover:

Optical details of the scanning system

G02B 26/10

Informative references

Attention is drawn to the following places, which may be of interest for search:

Character printers involving the fast moving of a light beam in two directions

G06K 15/1228

{for the main-scan only}
References
Limiting references

This place does not cover:

Optical details of the scanning system

G02B 26/10, G02B 26/12

Informative references

Attention is drawn to the following places, which may be of interest for search:

Optical printers using dot sequential main scanning by means of a light deflector

B41J 2/471

Character printers involving the fast moving of an optical beam in the main scan direction

G06K 15/1204

using the sheet-feed movement {or the medium-advance or the drum-rotation movement} as the slow scanning component, {e.g. arrangements for the main-scanning} ({sheet-feed movement by translatory movement of a flat picture-bearing surface H04N 1/1008; main-scanning using oscillating or rotating mirrors H04N 1/113; } using multi-element arrays H04N 1/19)
References
Limiting references

This place does not cover:

Scanning arrangements using multi-element arrays

H04N 1/19

Informative references

Attention is drawn to the following places, which may be of interest for search:

Character printers involving the fast moving of an optical beam in the main scan direction

G06K 15/1204

using multi-element arrays
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Optical printers using arrays of radiation sources

B41J 2/447

Photographic character printers simultaneously exposing more than one point

G06K 15/1238

{Simultaneously or substantially simultaneously scanning picture elements on more than one main scanning line, e.g. scanning in swaths}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Photographic character printers simultaneously exposing more than one point on more than one main scanning line

G06K 15/1257

Simultaneously {or substantially simultaneously} scanning picture elements on one main scanning line
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of the sub-scanning

H04N 1/10, H04N 1/12

Photographic character printers simultaneously exposing more than one point on one main scanning line

G06K 15/1242

Intermediate information storage (H04N 1/387, H04N 1/41 take precedence {; for control between transmitter and receiver or between image input and image output device H04N 1/32358; indexing, editing G11B 27/00})
Definition statement

This place covers:

Where the storage results in a record that is not merely transient.

References
Limiting references

This place does not cover:

Storage resulting in a transient record, for control or supervision between image input and image output device

H04N 1/32358

Composing, repositioning or otherwise modifying originals

H04N 1/387

Bandwidth or redundancy reduction

H04N 1/41

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Intermediate

having no limiting meaning

{using still video cameras}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Image capture in digital cameras

H04N 23/00

Still video cameras

H04N 2101/00

Reproducing arrangements
Definition statement

This place covers:

Arrangements providing the output copy of a document in a system performing the scanning, transmission and reproduction of documents or the like, e.g. printing arrangements integrated within a facsimile device

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of scanning heads

H04N 1/024

Scanning arrangements

H04N 1/04

Perforating or marking objects by electrical discharge

B26F 1/28

Selective printing mechanisms per se

B41J

involving production of a magnetic intermediate picture
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Magnetography

G03G 19/00

involving production of an electrostatic intermediate picture
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Electrography

G03G

Circuits or arrangements for control or supervision between transmitter and receiver {or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device (H04N 1/38, H04N 1/387 take precedence)}
Definition statement

This place covers:

Where the control or supervision is between two devices that can be embedded within the same apparatus or be embedded in multiple apparatuses.

Where the front-end device has images that are intended to be sent to the back-end device and the control can be from either the front-end device, the back-end device or both.

For example:

  • a still-image camera or scanner and another, separate device (e.g. printer, display, server)
  • a still-image camera or multi-function peripheral [MFP] and its internal memory
References
Limiting references

This place does not cover:

Circuits or arrangements for blanking or otherwise eliminating unwanted parts of pictures

H04N 1/38

Composing, repositioning or otherwise modifying originals

H04N 1/387

Informative references

Attention is drawn to the following places, which may be of interest for search:

Devices for controlling television cameras or cameras comprising electronic image sensors

H04N 23/60

Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device

H04N 2201/32

Digital output from electrical digital data processing unit to print unit

G06F 3/12

Real-time session management in data packet switching networks

H04L 65/1066

Session management in data packet switching networks

H04L 67/14

{Automation of particular receiver jobs, e.g. rejecting unwanted calls (requesting a communication from a transmitter H04N 1/32771; with picture signal storage for forwarding messages H04N 1/32358)}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Automatic arrangements for answering calls in telephonic equipment

H04M 1/64

{Automation of particular transmitter jobs, e.g. multi-address calling, auto-dialing}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Telephonic equipment for signalling identity of wanted subscriber

H04M 1/26

{Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Television systems for the transmission of television signals using pulse code modulation, using bandwidth reduction involving the insertion of extra data

H04N 19/467

Television bitstream transport arrangements involving transporting of additional information

H04N 21/23614

Broadcast communication systems specially adapted for using meta-information

H04H 60/73

Synonyms and Keywords

In patent documents, the following words/expressions are often used as synonyms:

  • "metadata" for "additional information"
{embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Image watermarking

G06T 1/0021

Audio watermarking

G10L 19/018

{Transform domain methods (H04N 1/32309 takes precedence)}
References
Limiting references

This place does not cover:

In colour image data

H04N 1/32309

Informative references

Attention is drawn to the following places, which may be of interest for search:

Transmission of digital television signals using bandwidth reduction and involving the insertion of extra data

H04N 19/467

{using picture signal storage, e.g. at transmitter (H04N 1/17 takes precedence)}
Definition statement

This place covers:

Storage resulting in a transient record, e.g. buffering.

References
Limiting references

This place does not cover:

Scanning speed being dependent on content of picture

H04N 1/17

Storage resulting in a record which is other than merely transient

H04N 1/21

{intermediate the transmitter and receiver terminals, e.g. at an exchange}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Stored and forward switching systems in transmission of digital information

H04L 12/54

{related to a single-mode communication, e.g. at the transmitter or at the receiver}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Coding, decoding or code conversion, for error detection or error correction

H03M 13/00

Arrangements for detecting or preventing errors in received digital information

H04L 1/00

{Establishing a communication with one of a facsimile and another telecommunication apparatus sharing a single line}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Simultaneous speech and other data transmission over the same conductors in telephonic communication systems

H04M 11/06

Mode signalling or mode changing; Handshaking therefor
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Negotiation of communication capabilities for communication control in transmission of digital information

H04L 69/24

{during transmission, input or output of the picture signal; within a single document or page}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems modifying digital information transmission characteristics according to link quality

H04L 1/0001

for coin-freed systems {; Pay systems}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Coin-freed or like apparatus per se

G07F

Telephonic metering

H04M 15/00

for synchronising or phasing transmitter and receiver
Definition statement

This place covers:

Obsolete subject matter relating to analogue facsimile communication.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Synchronisation of pulses

H03K 4/90

Circuits or arrangements for blanking or otherwise eliminating unwanted parts of pictures (H04N 1/387 takes precedence)
Definition statement

This place covers:

Removing parts of the image, e.g. smudges; extracting part of an image; screening out unwanted image regions; removing finger shadows; removing the holes when copying perforated paper.

References
Limiting references

This place does not cover:

Composing, repositioning or otherwise modifying originals

H04N 1/387

Special rules of classification

Dropping out parts of the image while changing colour is classified in H04N 1/62; form drop-out data is classified in H04N 1/4177.

Composing, repositioning or otherwise {geometrically} modifying originals
Definition statement

This place covers:

Composing, e.g. combining two images; reading of books and correction of geometric distortions due to a curved original (book page); geometric modifications caused by warping of the image.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Photoelectric composing of characters

B41B 19/00

Editing, producing a composite image by copying with focus on copy machine

G03G 15/36

Text processing

G06F 40/10

Pagination

G06F 40/114

Image data processing or generation, in general

G06T

Geometric modification and warping in general

G06T 3/00

Teaching/communicating with deaf, blind, mute people

G09B 21/00

{the composed originals being of different kinds, e.g. low- and high-resolution originals}
Definition statement

This place covers:

E.g. combining a chart, text or logo (low resolution/bit depth) with a photo (high resolution/bit depth), or foreground with background, with the focus on image processing. Also high dynamic range [HDR] imagery when combined with H04N 1/407.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Inserting foreground into background with focus on camera

H04N 5/272

Combining objects while rendering PDL

G06T 11/60

{Repositioning or masking}
Definition statement

This place covers:

Image cropping, cutting out, masking with arbitrary shape.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Selection or ordering of images (from movies)

G11B 27/034

{defined only by a limited number of coordinate points or parameters, e.g. corners, centre; for trimming}
Definition statement

This place covers:

The user defines corner coordinates to extract an image for repositioning. Cutting out or cropping where the number of defining points matters; low-resolution pre-scan followed by a high-resolution main scan of part of the platen.

{combined with enlarging or reducing (enlarging or reducing per se H04N 1/393)}
Definition statement

This place covers:

Part of the image is enlarged or reduced to fit a new position; reducing to fit the medium, zooming, use of a belief map.

References
Limiting references

This place does not cover:

Corrections or small zoom factors

H04N 1/3873

Enlarging or reducing

H04N 1/393

{Recombination of partial images to recreate the original image}
Definition statement

This place covers:

Combining two images which have been scanned by a scanner that does not cover the entire original; panoramic image creation, combination or stitching. The process is performed digitally, not mechanically.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Mechanical corrections

H04N 2201/0402

Mosaic images or mosaicing

G06T

Determination of transform parameters for the alignment of images, i.e. image registration

G06T 7/30

{Image rotation}
Definition statement

This place covers:

Rotating the image by any amount, e.g. 90 degrees; also when printing double-sided or four images on one page.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Image rotation when the focus is on image processing

G06T 3/60

{Skew detection or correction}
Definition statement

This place covers:

Limited to detecting and correcting skew, i.e. errors introduced during scanning, normally less than 45 degrees.
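
A minimal illustrative sketch, assuming numpy and scipy are available: skew of this kind can be estimated from a binarised scan by projection-profile analysis and then corrected by rotating the page back. The function names are hypothetical.

```python
import numpy as np
from scipy import ndimage  # assumed available; used only for the final rotation


def estimate_skew_angle(binary_page, max_deg=10.0, step_deg=0.1):
    """Estimate the skew of a scanned page (foreground pixels == 1).

    For each candidate angle the foreground pixels are projected onto the
    rotated row axis; at the correct angle the text lines align, which
    maximises the variance of the projection histogram.
    """
    rows, cols = np.nonzero(binary_page)
    best_angle, best_score = 0.0, -1.0
    for deg in np.arange(-max_deg, max_deg + step_deg, step_deg):
        theta = np.deg2rad(deg)
        projected = rows * np.cos(theta) - cols * np.sin(theta)
        hist, _ = np.histogram(projected, bins=binary_page.shape[0])
        score = hist.var()
        if score > best_score:
            best_angle, best_score = deg, score
    return best_angle


def deskew(gray_page, threshold=128):
    """Binarise the page, estimate its skew and rotate it back."""
    binary = (gray_page < threshold).astype(np.uint8)  # dark text on light paper
    angle = estimate_skew_angle(binary)
    return ndimage.rotate(gray_page, -angle, reshape=False, mode="nearest")
```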

Relationships with other classification places

See also G06V 10/243.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Mechanical skew detection

H04N 1/00681

Enlarging or reducing
Definition statement

This place covers:

Mainly the mechanical enlargement process applied to the whole image, e.g. DIN A4 to DIN A3 (larger than DIN A4).

Special rules of classification

This subgroup takes precedence over H04N 1/04.

{with modification of image resolution, i.e. determining the values of picture elements at new relative positions}
Definition statement

This place covers:

Digitally enlarging or reducing images with a change of resolution, e.g. by digital interpolation.
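
A minimal illustrative sketch of such digital enlargement or reduction by bilinear interpolation, assuming numpy is available; the function name is an assumption.

```python
import numpy as np


def resize_bilinear(image, new_h, new_w):
    """Enlarge or reduce a greyscale image by computing each output pixel
    from the four nearest input pixels (bilinear interpolation)."""
    h, w = image.shape
    # Coordinates of every output pixel expressed in input-pixel units.
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    img = image.astype(float)
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bottom = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bottom * wy
```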

Relationships with other classification places

Note that H04N 1/40068 covers resolution conversion where the physical size is irrelevant.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Interpolation in general

G06T 3/40

Picture signal circuits (H04N 1/387 takes precedence)
Definition statement

This place covers:

General documents regarding quality aspects, quantisation (errors), scanning in either black-and-white or colour, video printers, frame grabbers, memory arrangement or management, and smear reduction for CCDs.

References
Limiting references

This place does not cover:

Composing, repositioning or otherwise modifying originals

H04N 1/387

Informative references

Attention is drawn to the following places, which may be of interest for search:

Moving images, e.g. television

H04N 5/14

{Conversion of colour to monochrome}
Definition statement

This place covers:

Converting colour documents into B&W so that they can be printed on monochrome printers, e.g. changing green into stripes and red into dots; converting from RGB to grayscale via thresholding.
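
A minimal illustrative sketch of the simplest case above, assuming an 8-bit RGB input: a luminance weighting produces the grey value and a fixed threshold then yields black and white. The weights, threshold and function name are illustrative assumptions.

```python
import numpy as np


def rgb_to_two_tone(rgb, threshold=128):
    """Convert an RGB image to a two-tone (black/white) image.

    Step 1: a weighted sum of R, G and B gives a luminance (grey) value.
    Step 2: a fixed threshold turns the grey image into black and white.
    """
    grey = rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114
    return np.where(grey >= threshold, 255, 0).astype(np.uint8)
```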

{Circuits exciting or modulating particular heads for reproducing continuous tone value scales (H04N 1/401, H04N 1/407 take precedence)}
Definition statement

This place covers:

Writing: control of print heads, stylus heads, electrostatic heads; continuous driving signals.

Relationships with other classification places

Overlap with G06K 15/12.

References
Limiting references

This place does not cover:

Compensating positionally unequal response of the pick-up or reproducing head

H04N 1/401

Control or modification of tonal gradation or of extreme levels

H04N 1/407

Informative references

Attention is drawn to the following places, which may be of interest for search:

Multipass inkjet

G06K 15/102

{for a plurality of reproducing elements simultaneously}
Definition statement

This place covers:

Writing: multiple print elements, essentially LED or thermal printheads, but also using several lasers in parallel.

{the reproducing element being a laser}
Definition statement

This place covers:

Mainly continuous tone laser printers.

{Circuits for driving or energising particular reading heads or original illumination means (H04N 1/401, H04N 1/407 take precedence)}
Definition statement

This place covers:

Control of light during reading of a document: circuits for driving diodes, analogue switches for light control; also sensor exposure time, etc.

References
Limiting references

This place does not cover:

Compensating positionally unequal response of the pick-up or reproducing head

H04N 1/401

Control or modification of tonal gradation or of extreme levels

H04N 1/407

Informative references

Attention is drawn to the following places, which may be of interest for search:

Mechanical details

H04N 1/028

Lamps per se

H05B 39/041, H05B 41/38

{Discrimination between different image types, e.g. two-tone, continuous tone}
Definition statement

This place covers:

Image segmentation, i.e. finding regions in a bitmap image such as text, tables, photos or line images; also "mixed raster content" [MRC].

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Segmentation; Edge detection in general

G06T 7/10

Character recognition, OCR

G06V 30/40

{Modification of image resolution, i.e. determining the values of picture elements at new relative positions (H04N 1/3935 takes precedence)}
Definition statement

This place covers:

Changing resolution where the physical size is irrelevant, e.g. the original image is 600 dpi but the printer can only print 300 dpi, so conversion is necessary.

References
Limiting references

This place does not cover:

With modification of image resolution, i.e. determining the values of picture elements at new relative positions

H04N 1/3935

Informative references

Attention is drawn to the following places, which may be of interest for search:

Increasing or decreasing spatial resolution

G06K 15/1872

{Descreening, i.e. converting a halftone signal into a corresponding continuous-tone signal; Rescreening, i.e. combined descreening and halftoning}
Definition statement

This place covers:

Descreening and/or rescreening.
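
As a rough illustrative sketch, descreening can be approximated by low-pass filtering at a scale comparable to the halftone screen period, so that the printed dots merge back into continuous tone; the box filter below is an illustrative stand-in for more elaborate screen-adaptive filters.

```python
import numpy as np


def descreen(halftoned, radius=2):
    """Convert a halftone image back towards continuous tone by averaging
    over a (2*radius+1)^2 neighbourhood, i.e. a simple low-pass filter."""
    img = halftoned.astype(float)
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
              for dy in range(k) for dx in range(k)) / (k * k)
    return np.clip(out, 0, 255).astype(np.uint8)
```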

{Soft dot halftoning, i.e. producing halftone dots with gradual edges}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Resolution enhancement by intelligently placing sub-pixels when the focus is on write-head control

H04N 1/40025

General edge enhancement

H04N 1/409

{Multi-toning, i.e. converting a continuous-tone signal for reproduction with more than two discrete brightnesses or optical densities, e.g. dots of grey and black inks on white paper}
Definition statement

This place covers:

Provides more than just levels 0 and 255 for the image, e.g. levels 0, 127 and 255, i.e. multi-level halftoning. Typical documents: EP817464 (Seiko) shows two types of ink C1 and C2 (colour multi-toning in H04N 1/52); EP889639 (Xerox) shows levels of white, light gray, dark gray and black.
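
A minimal illustrative sketch of multi-level halftoning as described above: each continuous-tone pixel is mapped to the nearest of a small set of reproducible output levels. The levels 0, 127 and 255 follow the example given; the function name is an assumption.

```python
import numpy as np


def multi_tone(gray, levels=(0, 127, 255)):
    """Quantise a continuous-tone image to the nearest reproducible level,
    e.g. white, mid-grey and black dots of the printer."""
    levels = np.asarray(levels, dtype=float)
    # Distance from every pixel to every allowed level; pick the closest one.
    idx = np.abs(gray[..., None].astype(float) - levels).argmin(axis=-1)
    return levels[idx].astype(np.uint8)
```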

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Variation of dot size

H04N 1/4057

General bit depth reduction

H04N 19/90

{Modification of content of picture, e.g. retouching (geometric modifications H04N 1/387)}
Definition statement

This place covers:

Very few applications. Local modifications, e.g. lightening and posterisation of natural images.

Compensating positionally unequal response of the pick-up or reproducing head (H04N 1/403 takes precedence)
Definition statement

This place covers:

Limited to image readers, mostly line sensors: shading correction, illumination profile, head calibration, positionally varying noise, etc.; also defects in the image sensors and compensation of ambient illumination.
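
A minimal illustrative sketch of the basic two-point (offset/gain) shading correction for a line sensor, using a black reference and a white reference captured per pixel position; the variable names are illustrative assumptions.

```python
import numpy as np


def shading_correct(scan, black_ref, white_ref, white_level=255.0):
    """Compensate positionally unequal response of a line sensor.

    scan      : (lines, pixels) raw scan data
    black_ref : (pixels,) sensor output with the lamp off (offset per pixel)
    white_ref : (pixels,) sensor output scanning a white reference strip
    """
    gain = white_level / np.maximum(white_ref - black_ref, 1e-6)
    corrected = (scan.astype(float) - black_ref) * gain
    return np.clip(corrected, 0, white_level).astype(np.uint8)
```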

References
Limiting references

This place does not cover:

Discrimination between the two tones in the picture signal of a two-tone original

H04N 1/403

Informative references

Attention is drawn to the following places, which may be of interest for search:

Ambient illumination also in

H04N 1/00835

Control of light source

H04N 1/40056

Correction of isolated defects in image

H04N 1/409

Defect maps for area sensors

H04N 25/63

{of the reproducing head}
Definition statement

This place covers:

Printers: correction of misaligned or defective heads; head calibration.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Malfunctioning inkjet nozzles

B41J 2/165

Discrimination between the two tones in the picture signal of a two-tone original
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Shaping pulses by limiting or thresholding, in general

H03K 5/08

Halftoning, i.e. converting the picture signal of a continuous-tone original into a corresponding signal showing only two levels
Definition statement

This place covers:

Halftoning in general, either B&W only or each color layer separately. Examples are EP126782, EP673150.

{producing a dispersed dots halftone pattern, the dots having substantially the same size (different sizes H04N 1/4057)}
Definition statement

This place covers:

Dispersed dots, i.e. dots that are not concentrated in clusters spreading out from a central point. Examples are US5317418 (e.g. Gaussian filter, blue noise) and US5426122 (FM rasters).

{by error diffusion, i.e. transferring the binarising error to neighbouring dot decisions}
Definition statement

This place covers:

Error diffusion for halftoning; note that error diffusion is also used for other purposes in other parts of H04N 1/00. Examples are EP507354, EP808055.
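
A minimal illustrative sketch of error diffusion as defined by this group, here with the well-known Floyd-Steinberg weights 7/16, 3/16, 5/16 and 1/16 (one common choice, not the only one covered).

```python
import numpy as np


def error_diffusion(gray):
    """Binarise a greyscale image, pushing each pixel's quantisation error
    onto the not-yet-processed neighbours (Floyd-Steinberg weights)."""
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 255.0 if old >= 128 else 0.0
            out[y, x] = int(new)
            err = old - new
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out
```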

{with threshold modulated relative to input image data or vice versa}
Definition statement

This place covers:

Illustrative examples of subject matter classified in this group are WO8906080, EP715451.

{producing a clustered dots or a size modulated halftone pattern}
Definition statement

This place covers:

Halftone dots grow from a central point and spread in all directions. Also dispersed clusters. Illustrative examples are US3688033, EP651560.
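
A minimal illustrative sketch of clustered-dot halftoning: a tiled threshold matrix whose highest thresholds sit at the centre of each cell makes the black dot grow from the centre outwards as the input darkens. The 4x4 matrix is an illustrative textbook-style example, not taken from the cited documents.

```python
import numpy as np

# 4x4 threshold matrix with the highest thresholds at the centre of the cell,
# so the black dot grows outward from the centre as the input gets darker.
CLUSTERED_4 = (np.array([[ 0,  6,  7,  1],
                         [ 8, 14, 15,  9],
                         [10, 12, 13, 11],
                         [ 2,  4,  5,  3]]) + 0.5) * 256 / 16


def clustered_dot_halftone(gray):
    """Binarise with a tiled clustered-dot threshold matrix: a pixel becomes
    black when its grey value is below the local threshold."""
    h, w = gray.shape
    thresholds = np.tile(CLUSTERED_4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return np.where(gray < thresholds, 0, 255).astype(np.uint8)
```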

{the pattern varying in one dimension only, e.g. dash length, pulse width modulation [PWM]}
Definition statement

This place covers:

Growth of the halftone dot in one direction only; includes pulse width modulation [PWM]. Illustrative examples are EP212990, US4951152.

{the pattern being a mixture of differently sized sub-patterns, e.g. spots having only a few different diameters (multi-toning H04N 1/40087)}
Definition statement

This place covers:

Different dot sizes, where each dot has the same density. Illustrative examples are EP647059 (fig. 5), US4680645 (fig. 1).

Special rules of classification

For dots of different densities (inks) classify in H04N 1/40087.

{with details for producing a halftone screen at an oblique angle (H04N 1/4056 takes precedence)}
Definition statement

This place covers:

Illustrative examples are GB2026283, WO9307709.

References
Limiting references

This place does not cover:

Pattern varying in one dimension only

H04N 1/4056

Control or modification of tonal gradation or of extreme levels, e.g. background level
Definition statement

This place covers:

Selection of particular gamma correction table, correction depending on media scanned or printed on, film type correction, correction of tone scale for dot gain.
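
Tone-scale control of this kind is typically applied through a look-up table; the sketch below builds a simple gamma table and applies it. The gamma value and function names are illustrative assumptions.

```python
import numpy as np


def gamma_lut(gamma=2.2, max_val=255):
    """Build a 256-entry look-up table implementing a gamma curve."""
    x = np.arange(max_val + 1) / max_val
    return np.round((x ** (1.0 / gamma)) * max_val).astype(np.uint8)


def apply_tone_curve(gray, lut):
    """Apply any 256-entry tone-correction table to an 8-bit image."""
    return lut[gray]
```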

Relationships with other classification places

Similar to H04N 1/6027 for colour.

{dependent on the contents of the original}
Definition statement

This place covers:

Analysis of image content to determine final correction to be applied, e.g. automatic background deletion.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Conversion to binary

H04N 1/403

{using histograms}
Definition statement

This place covers:

Histogram analysis to determine tone correction parameters.
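
A minimal illustrative sketch of such histogram analysis: black and white points are derived from the cumulative histogram of the scanned page and the tone scale is stretched between them, so that the paper background becomes white. The percentiles are illustrative assumptions.

```python
import numpy as np


def histogram_stretch(gray, low_pct=1.0, high_pct=95.0):
    """Derive black and white points from the image histogram and stretch
    the tone scale so the paper background becomes white."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    cdf = np.cumsum(hist) / hist.sum() * 100.0
    black_point = np.searchsorted(cdf, low_pct)
    white_point = np.searchsorted(cdf, high_pct)
    scale = 255.0 / max(white_point - black_point, 1)
    out = (gray.astype(float) - black_point) * scale
    return np.clip(out, 0, 255).astype(np.uint8)
```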

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

In context of pure image processing

G06T 5/40

{dependent on references outside the picture}
Definition statement

This place covers:

Pre-scanning to read reference strips (B&W), which are used to set the maximum and minimum levels. Very limited test patterns containing only black (offset correction) and white (gain correction), e.g. printed next to an image or as a separate image. Standard pattern on a monitor (no light for the black reference and light on for the white reference).

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Monitor calibration per se

H04N 5/202, H04N 9/69

{using gradational references, e.g. grey-scale test pattern analysis}
Definition statement

This place covers:

Test pattern analysis for gray scale corrections.

Edge or detail enhancement; Noise or error suppression
Definition statement

This place covers:

Noise or error correction. Elimination of "streaky effects".

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning correction due to reader error

H04N 1/401

Image enhancement or restoration

G06T 5/00

Noise filtering in arrangements for image or video recognition or understanding

G06V 10/30

{Edge or detail enhancement}
Definition statement

This place covers:

Edge emphasis, sharpness correction, unsharp masking, smoothing; also covers first detecting edges and then correcting them.
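
A minimal illustrative sketch of unsharp masking, one of the techniques listed above: a blurred copy is subtracted from the original and a fraction of the difference is added back to emphasise edges. The box blur and gain are illustrative choices.

```python
import numpy as np


def unsharp_mask(gray, amount=1.0):
    """Edge emphasis: original + amount * (original - blurred)."""
    img = gray.astype(float)
    # Simple 3x3 box blur implemented by shifting and averaging; edge rows
    # and columns are handled by padding with the border values.
    padded = np.pad(img, 1, mode="edge")
    blurred = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0
    sharpened = img + amount * (img - blurred)
    return np.clip(sharpened, 0, 255).astype(np.uint8)
```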

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

For color

H04N 1/58

For cameras

H04N 5/208

Deblurring; Sharpening

G06T 5/73

{Correction of errors due to scanning a two-sided document, i.e. show-through correction}
Synonyms and Keywords

In patent documents, the following words/expressions are often used as synonyms:

  • "show-through" and "see-through"
{Removing errors due to external factors, e.g. dust, scratches}
Definition statement

This place covers:

Removal of streaks, dust, blemishes, tears, scratches, hairs. Removing scratches from photographs using an infrared image.

Bandwidth or redundancy reduction (by scanning H04N 1/17 {; methods or arrangements for coding, decoding, compressing or decompressing digital video signals H04N 19/00})
Definition statement

This place covers:

General coding groups for still images: B&W, gray scale, or each colour component separately. This head group covers the use of different coding techniques within the same document, combinations of different techniques, or choosing from different available coding methods (e.g. characters with technique 1, pictures with technique 2).

References
Limiting references

This place does not cover:

Bandwidth or redundancy reduction by scanning

H04N 1/17

Television systems for the transmission of television signals using bandwidth reduction

H04N 19/00

For mixed image compression

H04N 19/12

Informative references

Attention is drawn to the following places, which may be of interest for search:

Coding of color images

H04N 1/64

Bandwidth or redundancy reduction for data acquisition

G06F 17/40

Coding for image data processing in general

G06T 9/00

Data Compression in general

H03M 7/30

{for halftone screened pictures}
Definition statement

This place covers:

The image to be coded must be a halftoned image.

for the transmission {or storage} or reproduction of two-tone pictures, e.g. black and white pictures
Definition statement

This place covers:

B&W images, i.e. binary coding.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Continuous tone compression

H04N 19/00

{involving the recognition of specific patterns, e.g. by symbol matching}
Definition statement

This place covers:

E.g. Huffman coding.

Systems or arrangements allowing the picture to be reproduced without loss or modification of picture-information
Definition statement

This place covers:

Lossless coding, covering a variety of coding methods, e.g. comparing different codings of a line and choosing the shortest code; universal coding.

in which the picture-elements are subdivided or grouped into fixed one-dimensional or two-dimensional blocks
Definition statement

This place covers:

Block coding; also mixes of Huffman and run-length coding.

using predictive or differential encoding
Definition statement

This place covers:

Predictive coding, arithmetic coding.
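
A minimal illustrative sketch of predictive (differential) encoding of a scan line: each pixel is predicted from its left neighbour and only the prediction error is passed to the subsequent entropy coder, which is omitted here.

```python
import numpy as np


def predict_left(line):
    """Encoder side: residuals = pixel minus its left neighbour."""
    line = line.astype(np.int16)
    residuals = np.empty_like(line)
    residuals[0] = line[0]                 # first pixel sent as-is
    residuals[1:] = line[1:] - line[:-1]   # prediction error for the rest
    return residuals


def reconstruct_left(residuals):
    """Decoder side: a cumulative sum undoes the left-neighbour prediction."""
    return np.cumsum(residuals.astype(np.int16)).astype(np.uint8)
```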

{Progressive encoding, i.e. by decomposition into high and low resolution components}
Definition statement

This place covers:

Different resolutions of the image, wavelet coding for binary images.

{involving the encoding of tone transitions with respect to tone transitions in a reference line}
Definition statement

This place covers:

Differential coding, i.e. coding the change data between two lines.

{encoding document change data, e.g. form drop out data}
Definition statement

This place covers:

Templates: encoding only the data that change; encoding the difference of the image from a known template, e.g. scanned images of filled-out form sheets.
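
A minimal illustrative sketch of encoding only the change data relative to a known template, for two-tone images: the filled-in form is XORed with the blank form and only the sparse difference is passed to the coder. The names are illustrative assumptions.

```python
import numpy as np


def encode_against_template(filled_form, blank_template):
    """Return only the pixels that differ from the known blank form.

    For binary images the difference is simply an XOR, which is very sparse
    and therefore compresses well with any run-length or entropy coder.
    """
    return np.bitwise_xor(filled_form, blank_template)


def decode_with_template(difference, blank_template):
    """XOR the difference back onto the template to recover the filled form."""
    return np.bitwise_xor(difference, blank_template)
```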

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Color form drop-out

G06V 10/143

in which encoding of the length of a succession of picture-elements of the same value along a scanning line is the only encoding step {(H04N 1/4135 - H04N 1/417 take precedence)}
Definition statement

This place covers:

B&W run-length encoding.
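
A minimal illustrative sketch of run-length encoding of a single two-tone scan line (0 = white, 1 = black); the subsequent assignment of transmission codes to the run lengths is omitted.

```python
import numpy as np


def run_lengths(line):
    """Encode a two-tone scan line as alternating run lengths,
    starting by convention with a (possibly zero-length) white run."""
    line = np.asarray(line)
    runs, current, count = [], 0, 0      # current colour starts at white (0)
    for pixel in line:
        if pixel == current:
            count += 1
        else:
            runs.append(count)
            current, count = pixel, 1
    runs.append(count)
    return runs


def expand_runs(runs):
    """Decoder: rebuild the scan line from the alternating run lengths."""
    line, colour = [], 0
    for length in runs:
        line.extend([colour] * length)
        colour ^= 1
    return np.array(line, dtype=np.uint8)
```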

References
Limiting references

This place does not cover:

Baseband signal showing more than two values or a continuously varying baseband signal is transmitted or recorded

H04N 1/4135

Systems or arrangements allowing the picture to be reproduced without loss or modification of picture-information using predictive or differential encoding

H04N 1/417

Systems for two-way working {, e.g. conference systems (H04N 1/32 takes precedence)}
References
Limiting references

This place does not cover:

Circuits or arrangements for control or supervision between transmitter and receiver

H04N 1/32

Informative references

Attention is drawn to the following places, which may be of interest for search:

Television systems for two-way working

H04N 7/14

Selective content distribution, e.g. interactive television

H04N 21/00

Secrecy systems
References
Limiting references

This place does not cover:

Preventing unauthorised reproduction

H04N 1/00838

Informative references

Attention is drawn to the following places, which may be of interest for search:

Analogue secrecy television systems

H04N 7/16

Security arrangements for protecting computers or computer systems against unauthorised activity

G06F 21/00

Secret communication in general

H04K

Arrangements for secret or secure communication in transmission of digital information

H04L 9/00

{Restricting access, e.g. according to user identity (mechanisms actuated by cards, PIN or the like in apparatus for dispensing G07F 7/08)}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Restricting access to computer systems

G06F 21/30

Access-control involving the use of a pass

G07C 9/20

Verifying the identity or authority of a user of a system for the transmission of digital information

H04L 9/32

Protecting transmitted digital information from access by third parties

H04L 63/04

Access control in transmission of digital information

H04L 63/10

{Rendering the image unintelligible, e.g. scrambling}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems rendering a television signal unintelligible and subsequently intelligible

H04N 7/167

Ciphering or deciphering apparatus for cryptographic purposes

G09C

Secret communication by adding a second signal to make the desired signal unintelligible

H04K 1/02

{using digital data encryption}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangements for secret or secure communication using public key encryption algorithm

H04L 9/30

Colour picture communication systems
Definition statement

This place covers:

Colour edit systems, printers with different recording modes for color and monochrome, decision whether to print/scan in color or B&W, general color applications for fax. This is a very general group.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Colorimetry

G01J 3/46

{Conversion of monochrome to colour}
Definition statement

This place covers:

Very straightforward: conversion into color documents, e.g. pattern chart to color (the opposite of H04N 1/40012). Generating false color representations.

Picture signal generators (for halftone screening H04N 1/52)
Definition statement

This place covers:

Color image readers, hardware of apparatuses.

References
Limiting references

This place does not cover:

Circuits or arrangements for halftone screening

H04N 1/52

{using the same detector device sequentially for different colour components}
Definition statement

This place covers:

Filter wheels to separate components.

{with sequential colour illumination of the original}
Definition statement

This place covers:

The use of different lights to read the image, e.g. first R, then G, finally B, such as successive RGB LED lighting.

{with separate detectors, each detector being used for one specific colour component}
Definition statement

This place covers:

Using separate R, G and B sensor elements, typically line sensors. Also covers documents on correcting chromatic aberrations of 3-line CCD sensors, and RGB sensors with an additional monochrome sensor.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

For area sensors (Bayer matrix)

H04N 23/10

Demosaicing

G06T 3/4015, H04N 23/10

{using beam-splitters}
Definition statement

This place covers:

Splitting light using prisms, half (dichroic) mirrors or diffraction gratings; most applications deal with line sensors.

Picture reproducers (for halftone screening H04N 1/52)
Definition statement

This place covers:

Color printers, hardware of apparatuses.

References
Limiting references

This place does not cover:

Circuits or arrangements for halftone screening

H04N 1/52

{Reproducing the colour component signals dot-sequentially or simultaneously in a single or in adjacent picture-element positions}
Definition statement

This place covers:

Dot-by-dot printing, point-wise scanning, essentially either inkjet or laser beam printers.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

More details on inkjets

B41J 2/21

{Reproducing the colour component signals line-sequentially}
Definition statement

This place covers:

Line-by-line printing.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Alignment of dots

B41J 2/2135

{Reproducing the colour component signals picture-sequentially, e.g. with reproducing heads spaced apart from one another in the subscanning direction}
Definition statement

This place covers:

Picture-by-picture printing, i.e. one complete color separation after the other. Focus on image signal circuits, e.g. start-of-scan determination, sync marks on the print medium, misregistration correction correcting misalignment of individual print heads with respect to each other. Facet or face-to-face errors. This is the typical way color laser printers work, where the latent images are generated, developed and transferred one after the other.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Trapping is also used against misregistration, but is an image modification

H04N 1/58

Temperature

G02B 26/121

Purely mechanical corrections

G09G 15/01

{using the same reproducing head for two or more colour components}
Definition statement

This place covers:

Using one drum for more than one color, thermal transfer printers.

Circuits or arrangements for halftone screening
Definition statement

This place covers:

Colour halftoning; colour multi-toning, e.g. using more than one cyan (C1 and C2); screens; error diffusion.
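
As a purely illustrative Python sketch of one halftoning technique covered here (error diffusion; ordered screens and multi-level toning are others), Floyd-Steinberg binarisation of a greyscale image given as a list of rows of 0-255 values:

    def floyd_steinberg(image, threshold=128):
        # Binarise a greyscale image, diffusing the quantisation error of each pixel
        # to its right and lower neighbours with the 7/16, 3/16, 5/16, 1/16 weights.
        h, w = len(image), len(image[0])
        work = [list(row) for row in image]          # work on a copy
        out = [[0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                old = work[y][x]
                new = 255 if old >= threshold else 0
                out[y][x] = 1 if new else 0
                err = old - new
                if x + 1 < w:
                    work[y][x + 1] += err * 7 / 16
                if y + 1 < h:
                    if x > 0:
                        work[y + 1][x - 1] += err * 3 / 16
                    work[y + 1][x] += err * 5 / 16
                    if x + 1 < w:
                        work[y + 1][x + 1] += err * 1 / 16
        return out

    print(floyd_steinberg([[64, 128], [192, 255]]))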

Special rules of classification

H04N 1/40087 or some subgroup of H04N 1/405 may be applied additionally to H04N 1/52.

Conversion of colour picture signals to a plurality of signals some of which represent particular mixed colours, e.g. for textile printing
Definition statement

This place covers:

Printing with additional colours, e.g. additionally using orange and brown pigments, or white or gold; CMYKRGB printers.

Processing of colour picture signals (H04N 1/52 takes precedence)
Definition statement

This place covers:

General color image processing, color to 2-color conversion (e.g. RGB to black/red). Film type, document type, slide type, text+image, detection of mouse marker.

References
Limiting references

This place does not cover:

Circuits or arrangements for halftone screening

H04N 1/52

Edge or detail enhancement; Noise or error suppression, e.g. colour misregistration correction (H04N 1/62 takes precedence)
Definition statement

This place covers:

Self-explanatory as regards noise and edges. A substantial part of this subgroup deals with trapping (spreading and choking image objects), either on bitmap or on page description language (PDL) level.

References
Limiting references

This place does not cover:

Retouching, i.e. modification of isolated colours only or in isolated picture areas only

H04N 1/62

Informative references

Attention is drawn to the following places, which may be of interest for search:

For integration of trapping in PDL workflow

G06K 15/1826

Colour correction or control {(H04N 1/54 takes precedence)}
Definition statement

This place covers:

All kinds of color correction. Estimating spectrum from XYZ input.

References
Limiting references

This place does not cover:

Conversion of colour picture signals to a plurality of signals some of which represent particular mixed colours

H04N 1/54

{with simulation on a subsidiary picture reproducer (H04N 1/622 takes precedence; matching two or more picture reproducers H04N 1/6052)}
Definition statement

This place covers:

Color corrections involving representation of the image on a monitor, e.g. for interactive correction or for use as a soft proofer.

References
Limiting references

This place does not cover:

Matching printer and monitor for softproofing per se

H04N 1/6052, H04N 1/6055

With simulation on a subsidiary picture reproducer

H04N 1/622

{by simulating several colour corrected versions of the same image simultaneously on the same picture reproducer}
Definition statement

This place covers:

Fairly self-explanatory; typically the user selects one of the several simulated, corrected images.

{Conversion to subtractive colour signals}
Definition statement

This place covers:

Usually transformations from RGB to CMY, but also used generally for transformations to output device values, as long as the focus is on the transformation. Here, (matrix) equations are used.

{using look-up tables (H04N 1/6025 takes precedence)}
Definition statement

This place covers:

Look-up tables for color conversion, typically to CMY. Also interpolation methods to calculate the in-between values not stored in the tables, e.g. tetrahedral or cubic interpolations.
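
As an illustrative sketch only (assuming a coarse LUT stored as an (N, N, N, 3) array), the following Python/NumPy fragment shows trilinear interpolation between the eight grid points surrounding an input colour; tetrahedral interpolation, common in real devices, instead splits each cube into tetrahedra:

    import numpy as np

    def apply_3d_lut(rgb, lut):
        # Look up an RGB value (components in 0..1) in a coarse 3-D LUT and
        # trilinearly interpolate between the eight surrounding grid points.
        n = lut.shape[0]
        pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
        i0 = np.floor(pos).astype(int)
        i1 = np.minimum(i0 + 1, n - 1)
        f = pos - i0                                 # fractional part along each axis
        out = np.zeros(lut.shape[-1])
        for dx in (0, 1):
            for dy in (0, 1):
                for dz in (0, 1):
                    w = ((f[0] if dx else 1 - f[0]) *
                         (f[1] if dy else 1 - f[1]) *
                         (f[2] if dz else 1 - f[2]))
                    idx = (i1[0] if dx else i0[0],
                           i1[1] if dy else i0[1],
                           i1[2] if dz else i0[2])
                    out += w * lut[idx]
        return out

    # Toy 2x2x2 LUT whose grid points simply invert RGB to CMY.
    grid = np.array([0.0, 1.0])
    lut = 1.0 - np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
    print(apply_3d_lut((0.25, 0.5, 0.75), lut))      # ~[0.75, 0.5, 0.25]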

References
Limiting references

This place does not cover:

Generating a fourth subtractive colour signal using look-up tables

H04N 1/6025

{Generating a fourth subtractive colour signal, e.g. under colour removal, black masking}
Definition statement

This place covers:

Essentially the transformations to CMYK which involve the use of equations. Gray component replacement (GCR), undercolor removal (UCR).
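
A minimal Python sketch of the kind of equation meant here, assuming normalised RGB input and a simple linear undercolour-removal rule; production systems use device-dependent curves and more elaborate GCR strategies:

    def rgb_to_cmyk(r, g, b, ucr=1.0):
        # Subtractive complements, then replace the common grey component by black.
        c, m, y = 1.0 - r, 1.0 - g, 1.0 - b
        k = ucr * min(c, m, y)                  # amount of black generated (ucr in 0..1)
        return c - k, m - k, y - k, k

    print(rgb_to_cmyk(0.4, 0.5, 0.6))           # dark greyish blue -> some black generated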

{using look-up tables}
Definition statement

This place covers:

Four-colour look-up tables, also their interpolation.

{Correction or control of colour gradation or colour contrast (H04N 1/6058 takes precedence)}
Definition statement

This place covers:

General control and correction of tone reproduction curves. Gray balance and white balance as a result thereof. Aspects of saturation correction.

References
Limiting references

This place does not cover:

Reduction of colour to a range of reproducible colours

H04N 1/6058

Informative references

Attention is drawn to the following places, which may be of interest for search:

When focus is on white balance per se.

H04N 1/6077

White balance in cameras

H04N 9/73

{controlled by characteristics of the picture signal generator or the picture reproducer}
Definition statement

This place covers:

Device profiles, e.g. ICC profiles, profile management for several devices, profile editing.

{using test pattern analysis (H04N 1/6055 takes precedence)}
Definition statement

This place covers:

Printer or scanner calibration using color test patterns.

References
Limiting references

This place does not cover:

Matching two or more picture signal generators or two or more picture reproducers using test pattern analysis

H04N 1/6055

Informative references

Attention is drawn to the following places, which may be of interest for search:

For B&W

H04N 1/4078

Camera calibration

H04N 17/02

Color charts as such

G01J 3/52

In electrophotography

G03G 15/5041

{Matching two or more picture signal generators or two or more picture reproducers}
Definition statement

This place covers:

Specifically matching two (or more) devices to each other, e.g. for proofing, i.e. printer to printer or printer to monitor.

{using test pattern analysis}
Definition statement

This place covers:

Limited to the two-device scenario.

{Reduction of colour to a range of reproducible colours, e.g. to ink- reproducible colour gamut}
Definition statement

This place covers:

Gamut mapping and gamut conversion. Mainly within a device-independent space in order to map the color reproducibility of one device onto that of another device.
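
For illustration only, a Python sketch of one very simple clipping strategy (pulling an out-of-range colour toward its own grey level until it fits the unit RGB cube); real gamut mapping is typically performed in a device-independent space such as CIELAB and may compress the gamut rather than clip it:

    def clip_toward_grey(rgb):
        # Move an out-of-range colour toward the grey axis until every component
        # lies in [0, 1]; assumes the grey level itself is already in range.
        grey = sum(rgb) / 3.0
        t = 1.0
        for c in rgb:
            if c > 1.0:
                t = min(t, (1.0 - grey) / (c - grey))
            elif c < 0.0:
                t = min(t, (0.0 - grey) / (c - grey))
        return tuple(grey + t * (c - grey) for c in rgb)

    print(clip_toward_grey((1.3, 0.5, -0.1)))   # scaled back into the unit cube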

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

In relation to general image processing and computer graphics

G06T 11/001

{adapting to different types of images, e.g. characters, graphs, black and white image portions}
Definition statement

This place covers:

Corrections to an image which depend on the type of image object, i.e. different corrections within one page, e.g. text and pictures corrected differently.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Discrimination of image (object) types per se (B&W)

H04N 1/40062

Discrimination of image (object) types per se - (colour).

H04N 1/56

{Corrections to the hue}
Definition statement

This place covers:

Only hue changes, not luminance or chroma or saturation.

References
Limiting references

This place does not cover:

Saturation correction

H04N 1/6027

{Colour balance, e.g. colour cast correction}
Definition statement

This place covers:

Correction of e.g. color fog or blue shift in image.

Special rules of classification

H04N 1/6027 has precedence.

{within the L, C1, C2 colour signals}
Definition statement

This place covers:

E.g. histogram techniques in the L*a*b* color space.

{controlled by factors external to the apparatus}
Definition statement

This place covers:

Environmental factors.

{by viewing conditions, i.e. conditions at picture output}
Definition statement

This place covers:

E.g. correction for sunlight on the monitor, artificial lighting, flare.

{depending on characteristics of the input medium, e.g. film type, newspaper}
Definition statement

This place covers:

Different film types have different properties and thus need to be corrected differently. For newspaper, correction for yellowing is necessary.

Retouching, i.e. modification of isolated colours only or in isolated picture areas only
Definition statement

This place covers:

Correction limited to particular colors, e.g. the red of a red apple is selected and enhanced. Changing color information in a region.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

For skin color

H04N 1/628

{with simulation on a subsidiary picture reproducer}
Definition statement

This place covers:

With display of image on monitor for user selection and editing.

Systems for the transmission or the storage of the colour picture signal; Details therefor, e.g. coding or decoding means therefor {(H04N 19/00 takes precedence)}
Definition statement

This place covers:

Colour coding closely related to apparatus.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Compression of B&W

H04N 1/41

Compression per se

H04N 19/00

{Adapting to different types of images, e.g. characters, graphs, black and white image portions}
References
Limiting references

This place does not cover:

For different coding for different image types, but limited to B&W

H04N 1/41

Similar but for colour correction and not coding

H04N 1/6072

{using a reduced set of representative colours, e.g. each representing a particular range in a colour space}
Definition statement

This place covers:

Palettized colors, including methods of obtaining the palettization and their coding. Rounding; change from true color to 8-bit using a palette.
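
As an illustrative sketch only, mapping a true-colour pixel to the closest entry of a reduced palette (squared Euclidean distance in RGB); how the palette itself is obtained (e.g. median cut, octree) is the other half of the problem and is not shown:

    def nearest_palette_index(rgb, palette):
        # Return the index of the palette entry closest to the given pixel.
        r, g, b = rgb
        return min(range(len(palette)),
                   key=lambda i: (palette[i][0] - r) ** 2 +
                                 (palette[i][1] - g) ** 2 +
                                 (palette[i][2] - b) ** 2)

    palette = [(0, 0, 0), (255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]
    print(nearest_palette_index((200, 40, 30), palette))   # 1 -> red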

{Transmitting or storing colour television type signals, e.g. PAL, Lab; Their conversion into additive or subtractive colour signals or vice versa therefor (H04N 1/642, H04N 1/644 take precedence)}
Definition statement

This place covers:

Limited to YUV, Lab, etc.

References
Limiting references

This place does not cover:

Adapting to different types of images, e.g. characters, graphs, black and white image portions

H04N 1/642

Using a reduced set of representative colours, e.g. each representing a particular range in a colour space

H04N 1/644

{Transmitting or storing the primary (additive or subtractive) colour signals; Compression thereof (H04N 1/642 - H04N 1/646 take precedence)}
Definition statement

This place covers:

Limited to CMY or RGB, raw sensor data.

References
Limiting references

This place does not cover:

Adapting to different types of images, e.g. characters, graphs, black and white image portions

H04N 1/642

Transmitting or storing colour television type signals

H04N 1/646

Scanning details of television systems; Combination thereof with generation of supply voltages
Definition statement

This place covers:

  • Scanning arrangements using moving aperture, refractor, reflector or lens
  • Scanning arrangements using switched light sources, solid-state devices or cathode-ray tubes by deflecting electron beams
  • Scanning arrangements for motion picture films
by optical-mechanical means only (H04N 3/36 takes precedence)
References
Limiting references

This place does not cover:

Scanning of motion picture films

H04N 3/36

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Picture signal generators using optical-mechanical scanning means only, for colour television systems

H04N 23/17

Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning systems using movable or deformable optical elements for controlling the intensity, colour, phase, polarisation or direction of light

G02B 26/10

by means not exclusively optical-mechanical (H04N 3/36 takes precedence)
References
Limiting references

This place does not cover:

Scanning of motion picture films

H04N 3/36

Informative references

Attention is drawn to the following places, which may be of interest for search:

Devices or arrangements for the control of the direction of light arriving from an independent light source

G02F 1/00

by means of electrically scanned solid-state devices (for picture generation H04N 25/00)
Definition statement

This place covers:

Scanning details of electrically scanned solid-state picture reproducers.

References
Limiting references

This place does not cover:

Transforming light or analogous information into electric information using solid-state image sensors

H04N 25/00

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Picture reproducers using solid-state colour display devices

H04N 9/30

by deflecting electron beam in cathode-ray tube {, e.g. scanning corrections}
Definition statement

This place covers:

Deflection circuits for cathode-ray tubes, when specially adapted for television.

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Picture reproducers, specially adapted for colour television systems, using cathode-ray tubes

H04N 9/16

Informative references

Attention is drawn to the following places, which may be of interest for search:

Cathode ray oscillographs

G01R 13/20

Deflection circuits, of interest only in connection with cathode-ray tube indicators

G09G 1/04

Control arrangements or circuits using single beam cathode-ray tubes, the beam directly tracing characters, the information to be displayed controlling the deflection as a function of time in two spatial coordinates

G09G 1/08

Electric discharge tubes or discharge lamps

H01J

Linearisation of the ramp of a sawtooth-shaped pulse

H03K 4/90

Maintaining dc voltage constant
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Regulation of dc voltage in general

G05F

Controlling dimensions (by maintaining the cathode-ray tube high voltage constant H04N 3/185)
References
Limiting references

This place does not cover:

Circuits for controlling dimensions of picture on screen by maintaining the cathode-ray tube high voltage constant

H04N 3/185

Informative references

Attention is drawn to the following places, which may be of interest for search:

Control arrangements or circuits using single beam cathode-ray tubes, the beam tracing a pattern independent of the information to be displayed, this latter determining the parts of the pattern rendered respectively visible and invisible

G09G 1/14

Modifications of scanning arrangements to improve focusing
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Arrangements for convergence or focusing in cathode-ray tubes specially adapted for colour television systems

H04N 9/28

Informative references

Attention is drawn to the following places, which may be of interest for search:

Focussing circuits in general

H01J

Circuits special to multi-standard receivers
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry of multi-standard receivers in general

H04N 5/46

Scanning of motion picture films, e.g. for telecine
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Scanning of colour motion picture films, e.g. for telecine

H04N 9/11

Informative references

Attention is drawn to the following places, which may be of interest for search:

Picture signal generating by scanning motion picture films or slide opaques, e.g. for telecine

H04N 5/253

Details of television systems (scanning details or combination thereof with generation of supply voltages H04N 3/00)
Definition statement

This place covers:

Hardware-related or software-related aspects of television signal processing at the transmitter side or the receiver side

Relationships with other classification places
  • H04N 5/00 distinguishes itself from synchronising techniques in transmission of a digital video signal with one or more other digital signals, which are found in H04N 7/00
  • H04N 5/00 distinguishes itself from picture signal processing and corresponding techniques, which are found in subclasses G06T, G09G. This concerns image processing not specific to a television signal (G06T) or video signal processing specific to visual displays (G09G), e.g. LCD or plasma panels
  • H04N 5/00 features transmitter techniques specially adapted to analog transmission of television signals. The corresponding function places for generic transmission are found in subclasses H04N 21/00, H04B, H04H, H04L, H04W. This concerns servers, broadcast or multicast, home networks, wireless networks per se.
  • H04N 5/00 features receiver techniques specially adapted to the reception of analog television signals. The corresponding place for digital television receivers is H04N 21/00.
References
Limiting references

This place does not cover:

Scanning details of television systems; Combination thereof with generation of supply voltages

H04N 3/00

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Details of colour television systems

H04N 9/00

Wall TV displays

H04N 9/12

Details of stereoscopic television systems

H04N 13/00

Servers specifically adapted for the selective distribution of content

H04N 21/20

Client devices specifically adapted for the reception of, or interaction with, content in selective content distribution

H04N 21/40

Informative references

Attention is drawn to the following places, which may be of interest for search:

Selective content distribution

H04N 21/00

Constructional details related to the housing of computer displays

G06F 1/1601

Constructional details or arrangements for portable computers

G06F 1/1613

Power management in computer systems

G06F 1/3203

Image enhancement or restoration

G06T 5/00

Image analysis

G06T 7/00

Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes

G09G 3/00

Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

G09G 5/00

Diversity receivers

H04B 7/08

Broadcast synchronizing

H04H 20/18

Broadcast receivers

H04H 20/38

Synchronizing in TDMA

H04J 3/06

Receiver synchronizing

H04L 7/0012, H04L 7/0083

Home automation networks

H04L 12/2803

Special rules of classification

H04N 5/00 features a number of symbols corresponding to the same number of Indexing Codes (e.g. H04N 5/4448 as symbol and H04N 5/4448 as Indexing Code symbol).

Allocation of symbols and/or Indexing Code symbols:

  • A document containing invention information relating to details of television elements will be given a H04N 5/00 group.
  • A document containing additional information relating to details of television elements will be given a H04N 5/00 group.
  • A document merely mentioning further details of television elements will not be given a group, but it may receive an Indexing Code if the disclosure is considered relevant, e.g. when conversion of interlace to progressive scanning (H04N 7/012) involves motion estimation, H04N 5/145 is added.
Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Edging

detection of edges

Movement estimation

motion vector generation

KTC

Thermal noise on capacitor

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

GPS

global positioning system

PC

personal computer

STB

set top box

Synchronising (for television systems using pulse code modulation H04N 7/56)
References
Limiting references

This place does not cover:

Synchronising systems used in the transmission of pulse code modulated video signals with other pulse code modulated signal

H04N 7/56

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Colour synchronization

H04N 9/44

Synchronisation processes at server side for selective content distribution

H04N 21/242

Informative references

Attention is drawn to the following places, which may be of interest for search:

Synchronisation between a display unit and other units, e.g. other display units, video-disc players

G09G 5/12

Synchronisation of pulses having essentially a finite slope or stepped portions

H03K 4/90

Synchronisation of generators of electronic oscillations or pulses

H03L 7/00

Arrangements for synchronising receiver with transmitter in the transmission of digital information

H04L 7/00

Circuitry for reinsertion of dc and slowly varying components of signal; Circuitry for preservation of black or white level
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Circuits for reinsertion of dc and slowly varying components of colour signals

H04N 9/72

Circuitry for controlling amplitude response
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Circuits for controlling the amplitude of colour signals

H04N 9/68

Circuitry for suppressing or minimising disturbance, e.g. moiré or halo
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Suppression of noise in television signal recording

H04N 5/911

Studio circuitry; Studio devices; Studio equipment (cameras or camera modules comprising electronic image sensors, or control thereof H04N 23/00)
Definition statement

This place covers:

Circuitry, devices and other equipment specially adapted for use in a television studio, e.g. for mixing images or generating special effects.

References
Limiting references

This place does not cover:

Cameras or camera modules comprising electronic image sensors; Control thereof

H04N 23/00

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Studio equipment related to broadcast communication

H04H 60/04

Informative references

Attention is drawn to the following places, which may be of interest for search:

Buildings for studios for broadcasting, cinematography, television or similar purposes

E04H 3/22

Picture signal generating by scanning motion picture films or slide opaques, e.g. for telecine (scanning details therefor H04N 3/36 {; standard conversion therefor H04N 7/0112})
Definition statement

This place covers:

Picture signal generation by scanning motion picture films, i.e. cinematographic films, into video signals, e.g. telecine.

References
Limiting references

This place does not cover:

Scanning details therefor

H04N 3/36

Standards conversion therefor

H04N 7/0112

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Scanning of colour motion picture films, e.g. for telecine

H04N 9/11

Picture signal generators using flying-spot scanners (H04N 5/253 takes precedence)
Definition statement

This place covers:

Obsolete technology.

References
Limiting references

This place does not cover:

Picture signal generating by scanning motion picture films or slide opaques

H04N 5/253

Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects {; Cameras specially adapted for the electronic generation of special effects}
Definition statement

This place covers:

Studio circuits providing video special effects such as combining different images, changing the image aspect (geometric, orientation, etc.) or its aesthetic/artistic aspect, providing transitions between images, synthesizing background and foreground images, mixing and switching.

Signal distribution or switching
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Arrangements for broadcast or for distribution combined with broadcast

H04H 20/00

Mobile studios
Definition statement

This place covers:

Mobile studios, e.g. television studio equipment installed in vehicles for outdoor broadcasting.

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Broadcast communication aspects of mobile studios

H04H 60/05

Transforming light or analogous information into electric information (scanning details H04N 3/00; cameras or camera modules comprising electronic image sensors, or control thereof H04N 23/00; circuitry of solid-state image sensors [SSIS] or control thereof H04N 25/00)
Definition statement

This place covers:

Circuitry (electronic circuits) and driving.

References
Limiting references

This place does not cover:

Scanning details of television systems

H04N 3/00

Cameras or camera modules comprising electronic image sensors or control thereof

H04N 23/00

Circuitry of solid-state image sensors [SSIS]; Control thereof

H04N 25/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Light transforming elements

H01J, H01L

Transforming X-rays (cameras or camera modules for generating image signals from X-rays H04N 23/30; circuitry of SSIS for transforming X-rays into image signals H04N 25/30)
Definition statement

This place covers:

X-ray imaging systems that directly or indirectly detect incident X-ray photons.

References
Limiting references

This place does not cover:

Cameras or camera modules for generating image signals from X-rays

H04N 23/30

Circuitry of solid-state image sensors [SSIS] for transforming X-rays into image signals

H04N 25/30

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment

A61B 6/00

Measuring length, thickness or similar linear dimensions, angles, areas or irregularities of surfaces or contours, using X-rays

G01B 15/00

Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays, by transmitting the radiation through the material and forming a picture

G01N 23/04

Informative references

Attention is drawn to the following places, which may be of interest for search:

Measurement performed on radiation beams, e.g. position or section of the beam; Measurement of spatial distribution of radiation

G01T 1/29

Photographic processes for X-rays, infrared or ultraviolet light

G03C 5/16

Electrographic processes using X-rays, e.g. electroradiography

G03G 13/054

Apparatus for electrographic processes using X-rays, e.g. electroradiography

G03G 15/054

Transforming infrared radiation (cameras or camera modules for generating image signals from infrared radiation H04N 23/20; circuitry of SSIS for transforming infrared radiation into image signals H04N 25/20)
Definition statement

This place covers:

Image sensors other than solid state image sensors and control thereof for near and far infrared [IR] cameras, for example pyroelectric imaging tubes or image intensifier tubes.

References
Limiting references

This place does not cover:

Cameras or camera modules for generating image signals from infrared radiation

H04N 23/20

Circuitry of solid-state image sensors [SSIS] for transforming only infrared radiation into image signals

H04N 25/20

Informative references

Attention is drawn to the following places, which may be of interest for search:

Radiation pyrometry, e.g. infrared or optical thermometry

G01J 5/00

Investigating or analysing materials using infrared light

G01N 21/35

Photographic processes for X-ray, infrared or ultraviolet ray

G03C 5/16

Organic devices sensitive to infrared radiation

H10K 30/00

Thermoelectric devices comprising a junction of dissimilar materials

H10N 10/00

Thermoelectric devices without a junction of dissimilar materials

H10N 15/00

Transmitter circuitry {for the transmission of television signals according to analogue transmission standards} (H04N 5/14 takes precedence)
References
Limiting references

This place does not cover:

Picture signal circuitry for video frequency region

H04N 5/14

Informative references

Attention is drawn to the following places, which may be of interest for search:

Characteristics or internal components of servers

H04N 21/226

Transmitter circuits per se

H04B 1/04

Modulation circuits
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Digital communication modulator circuits

H01L 27/04, H01L 27/12, H10N 39/00

Modulation per se

H03C

Receiver circuitry {for the reception of television signals according to analogue transmission standards} (H04N 5/14 takes precedence)
References
Limiting references

This place does not cover:

Picture signal circuitry for video frequency region

H04N 5/14

Informative references

Attention is drawn to the following places, which may be of interest for search:

Characteristics or internal components of clients

H04N 21/426

Receiver circuits per se

H04B 1/16

{Circuit details of the additional information generator, e.g. details of the character or graphics signal generator, overlay mixing circuits}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Characteristics of or internal components of the client for processing graphics

H04N 21/42653

Generation of individual character patterns for visual indicators

G09G 5/24

Graphics pattern generators for visual indicators

G09G 5/36

Picture in picture {, e.g. displaying simultaneously another television channel in a region of the screen}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Displaying supplemental content in a region of the screen

H04N 21/4316

Demodulation-circuits
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Decoding digital information by demodulating in clients

H04N 21/42676

Digital communication demodulator circuits

H01L 27/06, H01L 27/14, H10B 61/00

Demodulation per se

H03D

for receiving on more than one standard at will (deflecting circuits of multi-standard receivers H04N 3/27)
References
Limiting references

This place does not cover:

Deflecting circuits of multi-standard receivers

H04N 3/27

Tuning indicators; Automatic tuning control
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Accessing communication channels in clients, tuning

H04N 21/4383

Tuning resonant circuits per se

H03J

Automatic frequency control per se

H03J 7/02

{Invisible or silent tuning}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Muting the audio signal in clients

H04N 21/4396

Automatic gain control
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Automatic gain control in amplifiers per se

H03G 3/20

for the sound signals
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Invisible or silent tuning

H04N 5/505

Processing of audio elementary streams in clients

H04N 21/439

Generation or supply of power specially adapted for television receivers
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Generation of supply voltages, in combination with electron beam deflecting

H04N 3/18

Informative references

Attention is drawn to the following places, which may be of interest for search:

Power management in clients

H04N 21/4436

Regulating of voltage or current in general

G05F

Transformers

H01F 29/00; H01F 30/00

Supplying or distributing electric power, in general

H02J

Static converters

H02M

Constructional details of receivers, e.g. cabinets or dust covers (furniture aspects {of television cabinets} A47B 81/06)
References
Limiting references

This place does not cover:

Furniture aspects of television cabinets

A47B 81/06

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Combinations of a television receiver with apparatus having a different main function

H05K 11/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Constructional details of communication receivers, in general

H04B 1/08

Casings, cabinets or drawers for electric apparatus, in general

H05K 5/00

Transforming electric information into light information (scanning details H04N 3/00)
References
Limiting references

This place does not cover:

Scanning details of television systems

H04N 3/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Electro-optic or magneto-optic devices

G02F 1/00

CRTs

H01J

Circuit details for cathode-ray display tubes {(deviation circuits H04N 3/16)}
Definition statement

This place covers:

Circuit details of cathode-ray display tubes pertaining to the conversion of electrical information into light information, when specially adapted for television displays.

References
Limiting references

This place does not cover:

Scanning in television systems by deflecting electron beam in cathode-ray tube

H04N 3/16

Informative references

Attention is drawn to the following places, which may be of interest for search:

Control arrangements or circuits, of interest only in connection with cathode-ray tube indicators

G09G 1/00

Generating pulses having essentially a finite slope or stepped portion

H03K 4/00

Circuit details for electroluminescent devices
Definition statement

This place covers:

Circuit details of the conversion of electric information into light information in electroluminescent television displays.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Control arrangements or circuits using electroluminescent elements for presentation of a single character by selection from a plurality of characters, or by composing the character by combination of individual elements

G09G 3/12

Control arrangements or circuits using electroluminescent panels for presentation of an assembly of a number of characters, by composing the assembly by combination of individual elements arranged in a matrix

G09G 3/30

Modifying the appearance of television pictures by optical filters or diffusing screens
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Diffusing elements per se

G02B 5/02

Optical filters per se

G02B 5/20

Projection arrangements for image reproduction, e.g. using eidophor
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Projection devices for colour picture display

H04N 9/31

Television signal recording
Definition statement

This place covers:

Video data recording:

  • Specially adapted recording devices such as a VCR, PVR, high speed camera, camcorder or a specially adapted PC
  • Interfaces between recording devices and other devices for input and/or output of video signals such as TVs, video cameras, other recording devices
  • Video recorder programming
  • Adaptations of the video signal for recording on specific recording media such as HDD, tape, drums, holographic support, semiconductor memories
  • Adaptations for reproducing at a rate different from the recording rate such as trick play modes and stroboscopic recording
  • Processing of the video signal for noise suppression, scrambling, field or frame skip, bandwidth reduction
  • Impairing the picking up, for recording, of a projected video signal
  • Regeneration of either a recorded video signal or for recording the video signal
  • Video signal recording wherein the recorded video signal may be accompanied by none, one or more video signals (stereoscopic signals or video signals corresponding to different story lines)
  • Production of a motion picture film from a television signal

Details specific to this group:

Relationships with other classification places
  • The subject-matter in the range H04N 5/92 - H04N 5/956 deals with recording and processing for recording of only black and white video signals while H04N 9/79 - H04N 9/898 deals with recording and processing for recording colour video signals.
  • H04N 5/76 (video recording) distinguishes itself from editing, which is found in G11B 27/00, in that the signals recorded and reproduced are video signals.
  • H04N 5/76 is a function place for recording or processing for recording. H04N 21/433 describes applications for recording in a distribution system.
  • H04N 5/76 features recording devices specially adapted to video data recording that can be programmed. The programming may be done by a user or by an algorithm. Business methods where the video recording feature or step is well known are generally classified in G06Q 30/02.
  • H04N 5/76 contains recording devices that are characterised by the connection to other devices through an interface. Typically information is sent or received by a recorder through an interface that impacts the recording or playback function. Interfaces in general are found in H04N 5/44.
  • H04N 5/76 contains video cameras that record video data to a recording medium. Constructional details of video cameras are found in H04N 23/00.
  • H04N 5/76 is an application place for video data trick play. Reproducing data in general at a rate different from the recording rate is found in G11B 27/005.
  • H04N 5/76 contains applications of video data processing for scrambling/encrypting video data for recording. Systems for rendering a video signal unintelligible are found in H04N 7/16 and H04N 21/00.
  • H04N 5/76 is an application place for video data reduction for recording. Video data compression is found in H04N 19/00.
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Video surveillance

H04N 7/18

Selective content distribution

H04N 21/00

Controlling video cameras

H04N 23/60

Alarm system using video cameras

G08B 13/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Production of a video signal from a motion picture film

H04N 5/253

Interfaces

H04N 5/44

Video data coding

H04N 19/00

Network video distribution

H04N 21/236

Personal video recorder in selective content distribution systems

H04N 21/4147

User interface of set top boxes

H04N 21/47

Video camera constructional details

H04N 23/00

Video data processing for printing

G03F 1/00

Systems for buying and selling, i.e. video content

G06Q 30/00

Business methods related to the distribution of video data content

G06Q 30/02

Information storage based on relative movement between record carrier and transducer

G11B

Video editing

G11B 27/034

Recording techniques specially adapted to a recording medium for recording digital data in general

G11B 27/10

Control of video recorders where the video signal is not substantially involved

G11B 31/00

Static stores

G11C

Special rules of classification

A document that does not explicitly mention that the video signal is a monochrome video signal is to be interpreted as relating to a colour video signal. As a consequence, some classes in H04N 5/76 specific to monochrome signal recording have fallen out of use. Instead the corresponding colour symbols should be given to such documents:

Allocation of CPC symbols:

  • A document containing invention information relating to video data recording will be given an H04N 5/76 CPC group.
  • A document containing additional information relating to video data recording (in particular, if the document discloses a detailed video recording device) will be given a H04N 5/76 Indexing Code symbol.
  • A document containing invention information for more than one invention may be given more than one H04N 5/76 CPC group.
  • A document merely mentioning recording will not be given a CPC group, but it may receive an Indexing Code if the disclosure is considered relevant.

Allocation of Indexing Code symbols in combination with CPC:

  • When assigning H04N 5/76 as CPC group, giving an additional Indexing Code is mandatory.

Combined use of Indexing Code symbols:

  • Indexing Code symbols may be allocated as necessary to describe additional information in the document.

Symbol allocation rules:

  • Documents defining recording devices that have an interface, e.g., connected to a network, should have at least one of the more specific H04N 5/765 Indexing Code symbols.
  • Documents dealing with invention information about measures to prevent recording of projected images should be given the H04N 2005/91392 Indexing Code symbol.
Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Video or video data

Video signal, analogue or digital, with or without accompanying audio

Interface circuits between an apparatus for recording and another apparatus
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangements for the associated working of recording or reproducing apparatus with related apparatus

G11B 31/00

{the recording apparatus and the television camera being placed in the same enclosure}
Definition statement

This place covers:

Video cameras as recording devices.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Television cameras

H04N 23/00

between a recording apparatus and a television receiver
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

TV-receiver details

H04N 5/44

Recording/reproduction devices integrated in TV-receivers

H04N 5/445

Synchronisation between a display unit and video-disc players

G09G 5/12

on disks or drums
Definition statement

This place covers:

Magnetic disks.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Recording on, or reproducing or erasing from, magnetic drums

G11B 5/004

Recording on, or reproducing or erasing from, magnetic disks

G11B 5/012

Magnetic drum carriers

G11B 5/76

Magnetic disk carriers

G11B 5/82

on tape
Definition statement

This place covers:

Video recorder programming applications, even though the group title reads (recording) "on tape".

Video recorder programming (reservation recording).

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Recording on, or reproducing or erasing from, magnetic tapes

G11B 5/008

Magnetic tape carriers

G11B 5/78

Arrangements for device control affected by the broadcast information

H04H 60/13

with stationary magnetic heads
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Fixed mountings of heads relative to magnetic record carriers

G11B 5/49

with rotating magnetic heads
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Disposition or mounting of heads on rotating support

G11B 5/53

Adaptations for reproducing at a rate different from the recording rate
Definition statement

This place covers:

  • Trick play modes as well as processing for recording to enable the reproduction of video data at a rate different from the recording rate.
  • High speed recording cameras.
  • Speed control during recording, reproducing, reproducing at variable speed.
using electrostatic recording (H04N 5/91 takes precedence)
References
Limiting references

This place does not cover:

Television signal processing for recording

H04N 5/91

Informative references

Attention is drawn to the following places, which may be of interest for search:

Recording or reproducing using electrostatic charge injection; record carriers therefor

G11B 9/08

using optical recording (H04N 5/80, H04N 5/89, H04N 5/91 take precedence)
References
Limiting references

This place does not cover:

Television signal recording using electrostatic recording

H04N 5/80

Television signal recording using holographic recording

H04N 5/89

Television signal processing for recording

H04N 5/91

Informative references

Attention is drawn to the following places, which may be of interest for search:

Recording or reproducing by optical means; record carriers therefor

G11B 7/00

on discs or drums
Definition statement

This place covers:

Optical discs.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Recording or reproducing by optical means with cylinders

G11B 7/0025

Recording or reproducing by optical means with disks

G11B 7/0037

using holographic recording (H04N 5/91 takes precedence)
References
Limiting references

This place does not cover:

Television signal processing for recording

H04N 5/91

Informative references

Attention is drawn to the following places, which may be of interest for search:

Recording, reproducing or erasing by using optical interference patterns, e.g. holograms

G11B 7/0065

using variable electrical capacitive recording (H04N 5/91 takes precedence)
References
Limiting references

This place does not cover:

Television signal processing for recording

H04N 5/91

Informative references

Attention is drawn to the following places, which may be of interest for search:

Recording or reproducing using record carriers having variable electrical capacitance; Record carriers therefor

G11B 9/06

using static stores, e.g. storage tubes or semiconductor memories (H04N 5/91 takes precedence)
References
Limiting references

This place does not cover:

Television signal processing for recording

H04N 5/91

Informative references

Attention is drawn to the following places, which may be of interest for search:

Television signal recording based on relative movement between record carrier and transducer

H04N 5/78 - H04N 5/903

Static stores per se

G11C

Television signal processing therefor
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Processing of colour television signals in connection with recording

H04N 9/79

for the suppression of noise {(H04N 5/932 takes precedence)}
References
Limiting references

This place does not cover:

Regeneration of analogue synchronisation signals

H04N 5/932

Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for suppressing or minimizing disturbance in television systems in general

H04N 5/21

for scrambling {; for copy protection}
Definition statement

This place covers:

  • Scrambling and encryption of video data for recording.
  • Copy-protection systems.
Special rules of classification

At least one H04N 5/913 Indexing Code symbol should be allocated to such documents to further specify the scrambling method.

for bandwidth reduction
Definition statement

This place covers:

Compression of analogue video signals.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Transformation of the television signal for recording by pulse code modulation

H04N 5/926, H04N 9/804

Systems in which the television signal is transmitted via one channel or a plurality of parallel channels, the bandwidth of each channel being less than the bandwidth of the television signal

H04N 7/12

Systems for the transmission of television signals using pulse code modulation

H04N 7/24

Methods or arrangements for coding, decoding, compressing or decompressing digital video signals

H04N 19/00

by dividing samples or signal segments, e.g. television lines, among a plurality of recording channels
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Processing of colour television signals for recording the signal in a plurality of channels, the bandwidth of each channel being less than the bandwidth of the signal

H04N 9/797

Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Transformation of the colour television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback

H04N 9/80

Informative references

Attention is drawn to the following places, which may be of interest for search:

Transmitter circuitry

H04N 5/38

Receiver circuitry

H04N 5/44

Modulation

H03C

Demodulation or transference of modulation from one carrier to another

H03D

Special rules of classification

The corresponding colour symbol should be allocated: H04N 9/82

by pulse code modulation (H04N 5/919 takes precedence)
References
Limiting references

This place does not cover:

Television signal processing for recording for bandwidth reduction by dividing samples or signal segments

H04N 5/919

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Transformation of the television signal for recording involving pulse code modulation of the colour picture signal components

H04N 9/804

Transformation of the television signal for recording involving pulse code modulation of the composite colour video-signal

H04N 9/808

the sound signal being pulse code modulated and recorded in time division multiplex with the modulated video signal
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Transformation of the television signal for recording involving pulse code modulation of the colour picture signal components with processing of the sound signal

H04N 9/806

Regeneration of the television signal or of selected parts thereof
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Regeneration of colour television signals by assembling picture element blocks in an intermediate memory

H04N 9/87

Special rules of classification

The corresponding colour symbol should be allocated: H04N 9/87

by assembling picture element blocks in an intermediate store
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Regeneration of colour television signals

H04N 9/877

Signal drop-out compensation
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Signal drop-out compensation in the regeneration of colour television signals

H04N 9/88

Special rules of classification

The corresponding colour symbol should be allocated: H04N 9/88

for signals recorded by pulse code modulation
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Signal drop-out compensation for signals recorded by pulse code modulation in the regeneration of colour television signals

H04N 9/888

Informative references

Attention is drawn to the following places, which may be of interest for search:

Error detection or correction of digital signals for recording in general

G11B 20/18

Time-base error compensation {(H04N 5/932 takes precedence)}
References
Limiting references

This place does not cover:

Regeneration of analogue synchronisation signals

H04N 5/932

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Time-base error compensation in the regeneration of colour television signals

H04N 9/89

Special rules of classification

The corresponding colour symbol should be allocated: H04N 9/89

by using an analogue memory, e.g. a CCD shift register, the delay of which is controlled by a voltage controlled oscillator
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Time-base error compensation using an analog memory, e.g. a CCD-shift register, the delay of which is controlled by a voltage controlled oscillator, in the regeneration of colour television signals

H04N 9/893

by using a digital memory with independent write-in and read-out clock generators
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Time-base error compensation using a digital memory with independent write-in and read-out clock generators, in the regeneration of colour television signals

H04N 9/898

Television systems (details H04N 3/00, H04N 5/00; methods or arrangements, for coding, decoding, compressing or decompressing digital video signals H04N 19/00; selective content distribution H04N 21/00)
Definition statement

This place covers:

  • structural or hardware-related aspects of television systems, involving:
  • analogue television signals or digital television signals processed at low level (e.g. physical layer in the OSI model);
  • details on conversion of television standards;
  • circuits for recovering digital non-picture data in analogue television signals;
  • specific arrangements allowing transmission of television signals via electric cables, optical fibres or using a GHz frequency band.
Relationships with other classification places

H04N 5/00 covers details of television systems and circuitry for processing analogue television signals or digital television signals processed at pixel level. Conversion between television standards and circuits for recovering digital non-picture data (slicers) are however classified in H04N 7/00.

H04N 9/00 and H04N 11/00 are to be considered when the focus is on colour aspects.

Aspects of diagnosis, testing and measuring for television systems are covered by H04N 17/00.

Television systems involving digital television signals not processed at low level should normally be classified in H04N 21/00.

Broadcast systems which are not specifically adapted for television signals should be classified in H04H.

Systems foreseen for the transmission/reception of data which may comprise inter alia television or video signals should be classified in the respective telecommunication areas H04B, H04L, H04M and H04W.

General image processing not specific to television signals belongs to G06T. Video signal processing specific to visual displays belongs to G09G.

References
Limiting references

This place does not cover:

Scanning details of television systems; Combination thereof with generation of supply voltages

H04N 3/00

Generation of supply voltages, in combination with electron beam deflecting

H04N 3/18

Details of television systems

H04N 5/00

Details of systems specific to colour television

H04N 9/00

Methods or arrangements for coding, decoding, compressing or decompressing digital video signals

H04N 19/00

Selective content distribution

H04N 21/00

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Colour television systems

H04N 11/00

Stereoscopic television systems

H04N 13/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes

A61B 1/00

Aspects of video games

A63F 13/00

Arrangements in vehicles for holding or mounting or controlling radio sets, television sets, telephones, or the like

B60R 11/02

Mounting of cameras operative during drive (of a vehicle)

B60R 11/04

Arrangements for entertainment or communications for passenger or crew in aircraft, e.g. radio, television

B64D 11/0015

Recognition of data in general

G06K

Commerce, e.g. shopping or e-commerce

G06Q 30/00

Image data processing or generation, in general

G06T

Remote control devices

G08C 17/00

Electrically-operated teaching apparatus or devices working with questions and answers

G09B 7/00

Simulators for teaching or training purposes

G09B 9/00

Miscellaneous advertising or display means not provided for elsewhere

G09F 19/00

Combined visual and audible advertising or displaying, e.g. for public address

G09F 27/00

Broadcast communication in general

H04H

Wireless networks in general

H04W

Special rules of classification

A document containing invention information relating to one of the subgroups will be given the corresponding CPC symbol.

A document containing additional information relating to one of the subgroups will be given the corresponding Indexing Code.

A document merely mentioning a television system will not be given a CPC symbol, but it may receive an Indexing Code if the disclosure is considered relevant.

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

3:2 pull-down pattern

a pattern of images where the first image is repeated 3 times and the second image is repeated twice (a short illustrative sketch follows this glossary)

HDTV

High Definition TeleVision

ISDN

Integrated Services Digital Network (ISDN): a circuit-switched telephone network system

Letter-box system

television system which displays images comprising a central part and black bars above and below the central part

MAC

Multiplexed Analogue Components (MAC): a satellite television transmission standard

Video signal

television signal
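
Illustrative sketch (editorial, not part of the classification text) of the 3:2 pull-down pattern defined above, assuming 24 frames/s film material converted to 60 fields/s video; the function name and sample data are chosen freely for the example:

    # 3:2 pull-down: map 24 frames/s film frames to 60 fields/s video fields by
    # alternately repeating a source frame for three fields and for two fields.

    def pulldown_32(frames):
        """Return the field sequence produced by applying 3:2 pull-down."""
        fields = []
        for i, frame in enumerate(frames):
            repeat = 3 if i % 2 == 0 else 2   # 3 fields, then 2 fields, alternating
            fields.extend([frame] * repeat)
        return fields

    if __name__ == "__main__":
        film = ["A", "B", "C", "D"]   # four film frames, i.e. 1/6 s at 24 frames/s
        print(pulldown_32(film))      # 10 fields, i.e. 1/6 s at 60 fields/s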

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

CRT

Cathode Ray Tube, a technology of display

CATV

Community Antenna Television

CCTV

Closed Circuit TeleVision

EPG

Electronic Programme Guide

GUI

Graphical User Interface

MATV

Master Antenna TeleVision

MPEG

Motion Picture Experts Group; a family of standards used for coding audio-visual information in a digital compressed format

PC

Personal Computer

PVR

Personal Video Recorder

STB

Set-Top Box

URL

Uniform Resource Locator

VOD

Video On Demand

{Special television systems not provided for by H04N 7/007 - H04N 7/18 (still pictures via a television channel H04N 1/00098)}
References
Limiting references

This place does not cover:

Transmission of still pictures via a television channel

H04N 1/00098

References out of a residual place

Examples of places in relation to which this place is residual:

Systems with supplementary picture signal insertion during a portion of the active part of a television signal, e.g. during top and bottom lines in a HDTV letter-box system

H04N 7/007

Conversion of standards

H04N 7/01

High-definition television systems

H04N 7/015

Systems for the transmission of digital non-picture data, e.g. of text during the active part of a television frame

H04N 7/025

Systems for the transmission of one television signal, i.e. both picture and sound, by a single carrier

H04N 7/04

Systems for the simultaneous transmission of one television signal, i.e. both picture and sound, by more than one carrier

H04N 7/06

Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band

H04N 7/08

Adaptations for transmission by electric cable

H04N 7/10

Systems in which the television signal is transmitted via one channel or a plurality of parallel channels, the bandwidth of each channel being less than the bandwidth of the television signal

H04N 7/12

Systems for two-way working

H04N 7/14

Analogue secrecy systems; Analogue subscription systems

H04N 7/16

Closed circuit television systems, i.e. systems in which the signal is not broadcast

H04N 7/18

Adaptations for transmission via a GHz frequency band, e.g. via satellite

H04N 7/20

Adaptations for optical transmission

H04N 7/22

Systems for the transmission of television signals using pulse code modulation

H04N 7/24

{Systems with supplementary picture signal insertion during a portion of the active part of a television signal, e.g. during top and bottom lines in a HDTV letter-box system}
Definition statement

This place covers:

  • systems in which auxiliary data allowing improved picture quality is transmitted during the active part of the TV signal, e.g. in black bands at the upper and lower edges of the picture;
  • letter-box systems.
Conversion of standards {, e.g. involving analogue television standards or digital television standards processed at pixel level}
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Conversion of colour television standards

H04N 11/20

Informative references

Attention is drawn to the following places, which may be of interest for search:

Studio circuits for television systems involving alteration of picture size or orientation

H04N 5/2628

Receiver circuitry for receiving on more than one standard at will

H04N 5/46

Saving bandwidth in low bit-rate video transmission

H04N 7/0115

Circuits specific for processing colour signals

H04N 9/64

Processing specific to video coder/decoder: subsampling at the coder and/or sample restitution by interpolation at the coder or decoder

H04N 19/00, H04N 19/59, H04N 19/587

Processing specific to video coder/decoder: transcoding to realise interoperability between different video coding standards

H04N 19/40

Image scaling in general

G06T 3/40

Video signal processing specific to visual displays

G09G

Adapting incoming signals to the display format of the display terminal

G09G 5/005

Frame rate conversion for reducing blurring effect in a hold-type liquid crystal display (LCD)

G09G 2320/0257

Interpolation filters

H03H 17/0444, H03H 17/0657

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

FRC

Frame Rate Converter

FRUC

Frame Rate Up-Converter

MC-FRC

Motion Compensation - Frame Rate Converter

NTSC

National Television System Committee

PAL

Phase alternating line

SECAM

Séquentiel couleur à mémoire (Sequential Colour with Memory)

High-definition television systems
Definition statement

This place covers:

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

High-definition colour television systems

H04N 11/24

Systems for the transmission of digital non-picture data, e.g. of text during the active part of a television frame {(transmission of digital non-picture data during the vertical blanking interval only H04N 7/088)}
Definition statement

This place covers:

Systems using the active part of a television signal or part of it for transmitting digital non-picture data not intended to be viewed as such on a display.

References
Limiting references

This place does not cover:

Transmission of still pictures via a television channel

H04N 1/00098

Transmission of digital non-picture data during the vertical blanking interval

H04N 7/088

Circuits for the digital non-picture data signal, e.g. for slicing of the data signal, for regeneration of the data-clock signal, for error detection or correction of the data signal
Definition statement

This place covers:

  • Circuits for recovering data transmitted during the non-active part of the television signal, e.g. vertical or horizontal blanking interval;
  • Circuits for recovering data transmitted during the active part of the television signal instead of the pictorial signal (an illustrative data-slicing sketch follows this list).
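
The following sketch (editorial illustration only; the sample values, the threshold rule and the function name are assumptions, not taken from any standard) shows the basic data-slicing operation referred to above: a sampled line carrying binary non-picture data is compared against a threshold derived from the signal itself, and one bit is read out per bit period.

    # Minimal data-slicer sketch: recover binary non-picture data from a sampled
    # television line.  The threshold is taken as the midpoint of the observed
    # signal range, standing in for the clock-run-in based threshold recovery
    # used in real slicers.

    def slice_data_line(samples, samples_per_bit):
        """Return the bit sequence carried by one sampled data line."""
        threshold = (max(samples) + min(samples)) / 2.0
        bits = []
        # Sample once per bit period, in the middle of each bit cell.
        for start in range(0, len(samples) - samples_per_bit + 1, samples_per_bit):
            centre = start + samples_per_bit // 2
            bits.append(1 if samples[centre] > threshold else 0)
        return bits

    if __name__ == "__main__":
        # Synthetic line carrying the bits 1 0 1 1 0, four samples per bit.
        line = [0.7] * 4 + [0.1] * 4 + [0.7] * 4 + [0.7] * 4 + [0.1] * 4
        print(slice_data_line(line, samples_per_bit=4))   # [1, 0, 1, 1, 0]
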
Systems for the transmission of one television signal, i.e. both picture and sound, by a single carrier {(H04N 7/084, H04N 7/087 take precedence)}
Definition statement

This place covers:

Systems for transmitting in a particular way both picture and sound by a single carrier.

References
Limiting references

This place does not cover:

Systems for the transmission of more than one television signal, with signal insertion during the horizontal blanking interval

H04N 7/084

Systems for the transmission of more than one television signal, with signal insertion during the vertical blanking interval

H04N 7/087

Systems for the simultaneous transmission of one television signal, i.e. both picture and sound, by more than one carrier {(H04N 7/084, H04N 7/087 take precedence)}
Definition statement

This place covers:

Systems for transmitting in a particular way both picture and sound by more than one carrier.

References
Limiting references

This place does not cover:

Systems for the transmission of more than one television signal, with signal insertion during the horizontal blanking interval

H04N 7/084

Systems for the transmission of more than one television signal, with signal insertion during the vertical blanking interval

H04N 7/087

Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band {, e.g. by time division (H04N 7/007 takes precedence)}
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Systems with supplementary picture signal insertion during a portion of the active part of a television signal, e.g. during top and bottom lines in a HDTV letter-box system

H04N 7/007

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

CC

Closed Caption

CRI

Clock Run-In

HBI

Horizontal Blanking Interval

RIC

Run-In Clock

VBI

Vertical Blanking Interval

Adaptations for transmission by electrical cable (H04N 7/12 takes precedence)
References
Limiting references

This place does not cover:

Systems in which the television signal is transmitted via one channel or a plurality of parallel channels, the bandwidth of each channel being less than the bandwidth of the television signal

H04N 7/12

Informative references

Attention is drawn to the following places, which may be of interest for search:

Coaxial connectors for coaxial cables

H01R 24/40

Networks for connecting several sources or loads, working on different frequencies or frequency bands, to a common load or source, particularly adapted for use in common antenna systems

H03H 7/461

Networks for connecting several sources or loads, working on the same frequency or frequency band, to a common load or source, particularly adapted for use in common antenna systems

H03H 7/482

Line transmission systems, in general

H04B 3/00

Repeater circuits for signals in two different frequency ranges transmitted in opposite directions over the same transmission path

H04B 3/38

Arrangements of wired systems for broadcast

H04H 20/76, H04H 60/93

CATV systems

H04H 20/78

Home automation networks

H04L 12/2803

Distribution of signals within a home automation network, e.g. involving splitting/multiplexing signals to/from different paths

H04L 12/2838

Systems in which the television signal is transmitted via one channel or a plurality of parallel channels, the bandwidth of each channel being less than the bandwidth of the television signal (H04N 7/24 takes precedence)
References
Limiting references

This place does not cover:

Systems for the transmission of television signals using pulse code modulation

H04N 7/24

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Colour television systems with bandwidth reduction

H04N 11/02

Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning details of television systems

H04N 3/00

High-definition television systems

H04N 7/015

Systems for two-way working ({H04N 7/12, } H04N 7/173 take precedence)
References
Limiting references

This place does not cover:

Systems in which the television signal is transmitted via one channel or a plurality of parallel channels, the bandwidth of each channel being less than the bandwidth of the television signal

H04N 7/12

Analogue secrecy or subscription systems with two-way working, e.g. subscriber sending a programme selection signal

H04N 7/173

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Client devices specifically adapted for the reception of, or interaction with, content, e.g. STB [set-top box]; Operations thereof

H04N 21/40

Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems for two-way working in the scanning, transmission or reproduction of documents or the like

H04N 1/42

Telephonic communication systems combined with television receiver for reception of entertainment or information matter

H04M 11/085

Conference systems
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems for two-way working between two video terminals, e.g. videophone

H04N 7/141

Arrangements for conference in data switching networks

H04L 12/18, H04L 12/1813

Telephonic conference arrangements

H04M 3/56

Multimedia conference systems

H04M 3/567

Analogue secrecy systems; Analogue subscription systems
Definition statement

This place covers:

Television systems where transmitters such as head-ends distribute analogue television signals to television receivers. Typically, access to the analogue television information is restricted on the basis of a subscription system, where the television viewer is charged for accessing the programmes he/she has selected. To prevent eavesdropping, the transmitted analogue signals are scrambled by the transmitter, e.g. in the time or amplitude domain, and descrambled at reception. Such systems can work in a unidirectional mode, where the transmitter decides which analogue television programmes the subscriber is entitled to view, or in a bidirectional mode, where the user can request to view a movie.

Relationships with other classification places

Unidirectional or bidirectional television systems involving the distribution of digital video signals fall within the scope of H04N 21/00.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Secrecy systems in the scanning, transmission or reproduction of documents or the like

H04N 1/44

Secret communication

H04K

{Constructional details of the subscriber equipment (H04N 7/164 takes precedence)}
References
Limiting references

This place does not cover:

Coin-freed apparatus

H04N 7/164

Informative references

Attention is drawn to the following places, which may be of interest for search:

Coin-freed and like apparatus in general

G07F

{Authorising the user terminal, e.g. by paying; Registering the use of a subscription channel, e.g. billing}
Definition statement

This place covers:

  • Entitlement systems, where the receiver is entitled to access the analog television program. Usually the user is billed therefor
  • Analog conditional access systems
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Payment schemes

G06Q 20/00

E-commerce

G06Q 30/00

{by receiver means only}
Definition statement

This place covers:

Television programs are broadcast in a scrambled form and only receivers fitted with e.g. a conditional access card can descramble them.

{Centralised control of user terminal (subsequent to an upstream request signal H04N 7/17345); Registering at central (by two-way working H04N 7/17309)}
Definition statement

This place covers:

Systems where typically head-ends select which receivers are entitled to receive the analog television programs.

References
Limiting references

This place does not cover:

Registering at central by two-way working

H04N 7/17309

Centralised control of user terminal subsequent to an upstream request signal

H04N 7/17345

Systems rendering the television signal unintelligible and subsequently intelligible
Definition statement

This place covers:

  • Systems operating in the time domain, e.g. by displacing synchronisation signals relative to active picture signals or vice versa or by changing or reversing the order of active picture signal portions
  • Systems operating in the amplitude domain, e.g. by modifying synchronisation signals or by inverting the polarity of active picture signal portions (an illustrative sketch of polarity inversion follows this list)
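
The sketch below (an editorial illustration, not CPC text; the 8-bit sample range and the key values are assumptions) shows the amplitude-domain technique of the last item: the polarity of selected active picture lines is inverted under control of a binary key sequence, and applying the same operation with the same sequence restores the original signal.

    # Illustrative amplitude-domain scrambler: invert the polarity of selected
    # active picture lines (8-bit sample values 0..255) under control of a
    # binary key sequence.  Descrambling is the identical operation performed
    # with the same sequence.

    def scramble_lines(lines, control_bits):
        """Invert every line whose control bit is 1; leave the others untouched."""
        out = []
        for line, bit in zip(lines, control_bits):
            out.append([255 - s for s in line] if bit else list(line))
        return out

    if __name__ == "__main__":
        picture = [[50, 200, 128], [230, 25, 100]]    # two lines of sample values
        key = [1, 0]                                  # invert line 0 only
        scrambled = scramble_lines(picture, key)
        restored = scramble_lines(scrambled, key)     # same operation descrambles
        print(scrambled[0], restored == picture)      # [205, 55, 127] True
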
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Secret communication by adding a second signal to make the desired signal unintelligible

H04K 1/02

{Providing digital key or authorisation information for generation or regeneration of the scrambling sequence (pseudo-random number generators in general G06F 7/58)}
Definition statement

This place covers:

Scrambling or descrambling of analog television signals based on digital keys
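
By way of illustration only (the register width and tap positions below are arbitrary choices for the example, not taken from any standard), a digital key may seed a linear feedback shift register whose output bits control the scrambler; a receiver holding the same key regenerates the identical sequence for descrambling.

    # Illustrative derivation of a scrambling control sequence from a digital
    # key: a 16-bit linear feedback shift register (taps chosen arbitrarily) is
    # seeded with the key and clocked once per output bit.

    def scrambling_sequence(key, length, taps=(0, 2, 3, 5), width=16):
        """Return `length` pseudo-random bits derived from the integer `key`."""
        state = key & ((1 << width) - 1) or 1        # avoid the all-zero state
        bits = []
        for _ in range(length):
            bits.append(state & 1)                   # output the low-order bit
            feedback = 0
            for t in taps:
                feedback ^= (state >> t) & 1         # XOR of the tapped bits
            state = (state >> 1) | (feedback << (width - 1))
        return bits

    if __name__ == "__main__":
        tx = scrambling_sequence(key=0xB5C3, length=12)
        rx = scrambling_sequence(key=0xB5C3, length=12)
        print(tx, tx == rx)   # identical sequences at transmitter and receiver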

References
Limiting references

This place does not cover:

Pseudo-random number generators in general

G06F 7/58

Computer security

G06F 21/00

Cryptography in general

H04L 9/00

Network security

H04L 63/00

with two-way working, e.g. subscriber sending a programme selection signal
Definition statement

This place covers:

Bidirectional systems

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems for two-way working in the scanning, transmission or reproduction of documents or the like

H04N 1/42

Client devices specifically adapted for the reception of, or interaction with, content, e.g. STB [set-top-box]; Operations thereof

H04N 21/40

{Transmission or handling of upstream communications}
Definition statement

This place covers:

Details of analog signal processing, coding or modulating in the upstream channel

{Direct or substantially direct transmission and handling of requests}
Definition statement

This place covers:

Typically on-demand systems for analog TV programs

{Handling of requests in head-ends}
Definition statement

This place covers:

Details of analog Video-on-Demand servers

Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Signal generation from motion picture films

H04N 5/253

Instruments for performing medical examinations of the interior of cavities or tubes of the body combined with television appliances

A61B 1/04

Real time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles

B60R 1/00

Arrangements in vehicles for holding or mounting or controlling radio sets, television sets, telephones, or the like

B60R 11/02

Mounting of cameras operative during drive of a vehicle; Arrangements of control thereof relative to the vehicle

B60R 11/04

Arrangements for entertainment or communications for passenger or crew in aircraft, e.g. radio, television

B64D 11/0015

Scanning a visible indication of a measured value and reproducing this indication at a remote place, e.g. on the screen of a cathode-ray tube

G01D 5/39

Recognition of data in general

G06K

Image processing in general

G06T

Burglar, theft, or intruder alarms using television cameras

G08B 13/196

Adaptations for transmission via a GHz frequency band, e.g. via satellite
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Space-based or airborne stations for radio transmission systems

H04B 7/185

Arrangements of satellite networks for broadcast

H04H 20/74

Adaptations for optical transmission
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Transmission systems employing electromagnetic waves other than radio waves, e.g. light

H04B 10/00

Arrangements of optical systems for broadcast

H04H 20/69

Systems for the transmission of television signals using pulse code modulation (H04N 21/00 takes precedence)
References
Limiting references

This place does not cover:

Source coding or decoding of a digital video signal

H04N 19/00

Error protection or correction of a digital video signal

H04N 19/89

Selective content distribution, e.g. interactive television

H04N 21/00

Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Colour television systems using pulse code modulation

H04N 11/04

Informative references

Attention is drawn to the following places, which may be of interest for search:

Transmission systems using pulse code modulation, in general

H04B 14/04

Special rules of classification

H04N 21/00 takes precedence, except for source coding or decoding of a digital video signal (H04N 19/00 takes precedence in this case), and for error protection, detection or correction of a digital video signal (H04N 19/89 takes precedence in this case).

Systems for transmission of a pulse code modulated video signal with one or more other pulse code modulated signals, e.g. an audio signal or a synchronizing signal (assembling of a multiplex stream by combining a video stream with other content or additional data, remultiplexing of multiplex streams, insertion of stuffing bits into the multiplex stream, assembling of a packetised elementary stream at server side H04N 21/236; disassembling of a multiplex stream, remultiplexing of multiplex streams, extraction or processing of Service Information, disassembling of packetised elementary stream at client side H04N 21/434)
References
Limiting references

This place does not cover:

Assembling of a multiplex stream, by combining a video stream with other content or additional data, remultiplexing of multiplex streams, insertion of stuffing bits into the multiplex stream, assembling of a packetised elementary stream at server side

H04N 21/236

Disassembling of a multiplex stream, remultiplexing of multiplex streams, extraction or processing of Service Information, disassembling of packetised elementary stream at client side

H04N 21/434

Definition statement

This place covers:

Older multiplexing/demultiplexing and transport technologies used before the introduction of the MPEG system layer, based on a format, e.g. a frame format, usable for transmission or recording of compressed or uncompressed video data, possibly combined with other content, e.g. audio.
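
As a purely illustrative sketch of such a frame format (the sync byte and field sizes are invented for the example and do not correspond to any particular standard), each transmission frame below carries a fixed amount of video data and audio data, and the demultiplexer recovers the two streams from the frame sequence.

    # Illustrative fixed frame format multiplexing video and audio byte streams.
    # Each frame: 1 sync byte + VIDEO_BYTES of video + AUDIO_BYTES of audio.
    # The sizes and the sync byte are arbitrary choices for this example.

    SYNC = b"\x47"
    VIDEO_BYTES = 8
    AUDIO_BYTES = 2

    def multiplex(video, audio):
        """Interleave video and audio byte streams into fixed-size frames."""
        frames, v, a = b"", 0, 0
        while v < len(video):
            payload = (video[v:v + VIDEO_BYTES].ljust(VIDEO_BYTES, b"\x00")
                       + audio[a:a + AUDIO_BYTES].ljust(AUDIO_BYTES, b"\x00"))
            frames += SYNC + payload
            v += VIDEO_BYTES
            a += AUDIO_BYTES
        return frames

    def demultiplex(frames):
        """Recover the video and audio byte streams from the frame sequence."""
        video, audio, size = b"", b"", 1 + VIDEO_BYTES + AUDIO_BYTES
        for i in range(0, len(frames), size):
            frame = frames[i:i + size]
            assert frame[:1] == SYNC                 # frame alignment check
            video += frame[1:1 + VIDEO_BYTES]
            audio += frame[1 + VIDEO_BYTES:]
        return video, audio

    if __name__ == "__main__":
        mux = multiplex(b"0123456789ABCDEF", b"wxyz")
        print(demultiplex(mux))   # (b'0123456789ABCDEF', b'wxyz')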

References
Limiting references

This place does not cover:

Multiplexing/demultiplexing of asynchronous signals, e.g. MPEG system layer type signals, involving the use of transport streams, program streams

H04N 21/236, H04N 21/434

Use of PCR for clock recovery

H04N 21/242, H04N 21/4305

Use of time stamps (PTS, DTS) for content synchronisation

H04N 21/242, H04N 21/4307

Special rules of classification

  • Multiplexing/demultiplexing video and audio: H04N 21/2368, H04N 21/4341 take precedence;
  • multiplexing/demultiplexing video and additional data: H04N 21/23614, H04N 21/4348 take precedence;
  • multiplexing/demultiplexing several video streams: H04N 21/2365, H04N 21/4347 take precedence;
  • multiplexing/demultiplexing isochronously with video sync, e.g. according to bit-parallel or bit-serial interface formats, as SDI: H04N 21/23602, H04N 21/4342 take precedence.

Synchronising systems therefor
Definition statement

This place covers:

Synchronisation for signals falling under H04N 7/54

References
Limiting references

This place does not cover:

Use of PCR for clock recovery

H04N 21/242, H04N 21/4305

Use of time stamps (PTS, DTS) for content synchronisation

H04N 21/242, H04N 21/4307

Informative references

Attention is drawn to the following places, which may be of interest for search:

Synchronising of television systems, in general

H04N 5/04

Synchronisation of generators of electronic oscillations or pulses

H03L 7/00

Arrangements for synchronising receiver with transmitter in the transmission of digital information

H04L 7/00

Details of colour television systems
Definition statement

This place covers:

  • Picture signal generators
  • Picture reproducers using opto-mechanical scanning, cathode-ray tubes, solid-state colour displays or projection devices
  • Conversion of monochrome to colour image signals
  • Colour synchronisation
  • Processing the brightness and chrominance signals in relation to each other
  • Processing of colour signals in general as well as specifically for recording
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Details of stereoscopic colour television systems

H04N 13/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning details of television systems, in general

H04N 3/00

Details of television systems, in general

H04N 5/00

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

ESLM

Electronic Spatial Light Modulator

DMD

Deformable mirror device

LCLV

Liquid Crystal Light Valve

D-ILA

Direct Drive Image Light Amplifier

HDR

High Dynamic Range

LCOS

Liquid Crystal On Silicon

DSP

Digital Signal Processor

DLP

Digital Light Processor

CRT

Cathode Ray Tube

RGB

Red Green Blue

CYM

Cyan Yellow Magenta

Scanning of colour motion picture films, e.g. for telecine
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of scanning of motion picture films, e.g. for telecine, applicable to television systems in general

H04N 3/36

Picture signal generating by scanning motion picture or slide opaques, e.g. for telecine

H04N 5/253

Picture reproducers (H04N 9/11 takes precedence)
Definition statement

This place covers:

Video walls (excluding multi-projection displays)

References
Limiting references

This place does not cover:

Scanning of colour motion picture films, e.g. for telecine

H04N 9/11

Video walls or multiscreen displays when each modular display is a projection device.

H04N 9/3147

Informative references

Attention is drawn to the following places, which may be of interest for search:

Devices or arrangements for the electro-, magneto- or acousto-optical modulation or deflection of light beams

G02F

Control arrangements or circuits for visual indicators characterised by the way in which colour is displayed, common to cathode-ray tubes and other visual indicators

G09G 5/02

using optical-mechanical scanning means only
References
Limiting references

This place does not cover:

Scanning of colour motion picture films, e.g. for telecine

H04N 9/11

Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning by optical-mechanical means only, applicable to television systems in general

H04N 3/02

using cathode ray tubes (H04N 9/11 takes precedence)
References
Limiting references

This place does not cover:

Scanning of colour motion picture films

H04N 9/11

Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning by deflecting electron beam in cathode-ray tube, applicable to television systems in general

H04N 3/16

Control arrangements or circuits using colour cathode-ray tube indicators

G09G 1/28

Cathode-ray tubes per se

H01J 31/00

Arrangements for convergence or focusing
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Modification of scanning arrangements to improve focusing in cathode-ray tubes, applicable to television systems in general

H04N 3/26

using demagnetisation or compensation of external magnetic fields
Definition statement

This place covers:

Details of degaussing circuits.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of CRT or electron-beam tubes

H01J 29/003

using solid-state colour display devices
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning by means of electrically scanned solid-state devices, applicable to picture reproduction in television systems in general

H04N 3/14

Indicating devices using static means to present variable information

G09G

Projection devices for colour picture display {, e.g. using electronic spatial light modulators [ESLM]}
Definition statement

This place covers:

  • Image projection using an electronic spatial light modulator [ESLM], i.e. processing of electrical image signals provided to the ESLM for the generation of projector control signals, for controlling the ESLM, e.g. control of the light source based on the electronic image signal, light conditioning specially adapted for the ESLM
  • in-projector image processing, electronic image data manipulation, e.g. during display or projection
  • details of projectors peculiar to the use of an ESLM, e.g. dichroic and polarizing arrangements specially adapted for the ESLM
  • remote control of projectors peculiar to the ESLM, e.g. affecting their operation, or based on a generated image signal;
  • adaptations peculiar to the use of an ESLM and/or the display, the transmission, recording or other use of electrical image data and related circuitry, e.g. mounting of ESLM, integrated
Relationships with other classification places

Subclass G03B contains subject-matter relating to the following aspects:

  • Aspects of apparatus/methods for projecting or viewing images using an electronic spatial light modulator [ESLM], insofar as they correspond to those of said apparatus/methods for projecting or viewing images using film stock, photographic film or slides, i.e. insofar as not peculiar to the presence of the ESLM, e.g. mounting of optical elements not peculiar to the presence of the ESLM, and their related controls not peculiar to the presence of the ESLM, e.g. cooling, beam shaping, optical keystone correction;
  • (opto-)mechanical image enhancement in printers or projectors (e.g. keystone correction);
  • constructional aspects of projectors, e.g. cooling, beam shaping, light integrating means not peculiar to the ESLM;

Subclass G02B contains subject-matter relating to the following aspects:

  • Optical image modulation using control of the direction of light, e.g. deformable mirror devices (DMDs),
  • laser speckle optics,
  • head-up projection displays (head-mounted displays).

Subclass G02F contains subject-matter relating to the following aspects:

  • Control of light using liquid crystals.

Subclass G09F 9/00 contains subject-matter relating to the following aspects:

  • Indicating arrangements for variable information (e.g. street or stadium displays).
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Image reproducers

H04N 13/30

Informative references

Attention is drawn to the following places, which may be of interest for search:

Projection arrangements for image reproduction e.g. using eidophor

H04N 5/74 - H04N 5/7491

Optical systems in general

G02B

Devices for controlling direction of light e.g. DMD's

G02B 26/08

Head-up displays

G02B 27/01

Speckle reduction

G02B 27/48

Light control

G02F

Film projection and photography

G03B

Projection devices using film stock, photographic film or slides

G03B 21/00

Details of film projectors

G03B 21/14

Projection screens

G03B 21/56

Image processing per se

G06T

Displaying of variable information using colour tubes

G09G 1/28

Control of colour illumination sources

G09G 3/3413

Liquid crystal colour display with specific pixel layout

G09G 3/3607

Characterised by the way in which colour is displayed

G09G 5/02

Using circuits for interfacing with colour displays

G09G 5/04

Using colour palettes

G09G 5/06

{scanning a light beam on the display screen (scanning a light beam on a screen in displays other than projection devices G09G 3/02; scanning systems in general G02B 26/10; projectors using laser light sources in general H04N 9/3161)}
Definition statement

This place covers:

Scanning projection devices wherein a light beam (e.g. a point beam or a linear beam from a laser or an LED) is scanned across a screen (e.g. using scanning mirrors).

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Projectors using laser light sources in general

H04N 9/3161

XY Scanning, scanning systems in general

G02B 26/10

Laser speckle optics

G02B 27/48

Semiconductor lasers

H01S 5/00

Colour synchronisation
Definition statement

This place covers:

  • Synchronisation of the modulated colour signal in relation to the colour subcarrier,
  • colour subcarrier generation in relation to the extracted burst (an illustrative phase-estimation sketch follows this list).
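
Purely as an illustration of subcarrier regeneration from the extracted burst (the subcarrier frequency, sample rate and burst length below are free choices for the example), the phase of the local subcarrier can be estimated by correlating the burst samples with quadrature references at the subcarrier frequency:

    # Illustrative burst phase estimation: correlate the extracted burst samples
    # with local cosine/sine references at the subcarrier frequency and take the
    # arctangent of the two correlations.

    import math

    def burst_phase(burst, f_sc, f_sample):
        """Return the burst phase (radians) relative to the local cosine reference."""
        i_acc = q_acc = 0.0
        for n, s in enumerate(burst):
            angle = 2 * math.pi * f_sc * n / f_sample
            i_acc += s * math.cos(angle)   # in-phase correlation
            q_acc += s * math.sin(angle)   # quadrature correlation
        return math.atan2(-q_acc, i_acc)   # phase of a cos(angle + phi) burst

    if __name__ == "__main__":
        F_SC, F_S, PHI = 4.43e6, 17.72e6, 0.6   # example numbers, chosen freely
        burst = [math.cos(2 * math.pi * F_SC * n / F_S + PHI) for n in range(40)]
        print(round(burst_phase(burst, F_SC, F_S), 3))   # 0.6
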
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Synchronising, applicable to television systems in general

H04N 5/04

Synchronisation of pulses

H03K 4/90

Circuits for processing colour signals (H04N 9/77 takes precedence; camera processing pipelines for processing colour signals H04N 23/84)
Definition statement

This place covers:

  • Colour video sampling format conversion, e.g. 4:2:2 to 4:2:0
  • Gamut mapping and colour space conversions (see the illustrative sketch after this list)
  • Multiprimary colour signal conversion
  • Colour sampling in digital video, e.g. 4:4:4, 4:2:0, 4:1:1
  • Processing of the modulated or demodulated colour television signal
  • Input colour signal detection relating to the type and standard of colour signals
  • Synchronous modulation and demodulation of the colour signals
  • Image enhancement or disturbance suppression specific to the modulated or demodulated colour television signal
  • Colour space transformation of the demodulated colour signal
  • Amplitude control and gamma control of the modulated or demodulated colour television signal
  • DC control of the modulated colour television signal according to vertical blanking reference
  • White balance control of the demodulated colour signal for display 
  • Mixing of foreground and background colour video signals using chroma keying
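
As an editorial illustration of the colour space conversion and chroma subsampling items above (the sketch uses the well-known BT.601 luma coefficients and a simple 2x2 averaging for 4:2:0; these are one possible choice, not the only one):

    # Illustrative RGB -> YCbCr conversion (BT.601 coefficients, full-range
    # values in 0..1) followed by 4:2:0 chroma subsampling by 2x2 averaging.

    def rgb_to_ycbcr(r, g, b):
        """Convert one full-range RGB sample to (Y, Cb, Cr)."""
        y = 0.299 * r + 0.587 * g + 0.114 * b
        cb = 0.5 * (b - y) / (1.0 - 0.114)   # scaled so Cb, Cr lie in -0.5..0.5
        cr = 0.5 * (r - y) / (1.0 - 0.299)
        return y, cb, cr

    def subsample_420(chroma):
        """Average each 2x2 block of a chroma plane (a list of equal-length rows)."""
        out = []
        for y in range(0, len(chroma), 2):
            row = []
            for x in range(0, len(chroma[0]), 2):
                block = (chroma[y][x] + chroma[y][x + 1]
                         + chroma[y + 1][x] + chroma[y + 1][x + 1])
                row.append(block / 4.0)
            out.append(row)
        return out

    if __name__ == "__main__":
        # Pure red -> (0.299, -0.169, 0.5)
        print(tuple(round(v, 3) for v in rgb_to_ycbcr(1.0, 0.0, 0.0)))
        # One 2x2 chroma block averaged to a single 4:2:0 sample -> [[0.5]]
        print(subsample_420([[0.25, 0.75], [0.5, 0.5]]))
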
Relationships with other classification places

With respect to colour or chrominance aspects, main group H04N 1/00 contains subject-matter relating to the following aspects:

  • Aspects of apparatus/methods for controlling or correcting colour video signals originating from a scanned picture signal, e.g. facsimile, document, photo.

Subclass G06T contains subject-matter relating to the following aspects:

  • General purpose data processing of an image or enhancement of such image not particularly adapted to a motion video signal.

Subclass H03D contains subject-matter relating to the following aspects:

  • Demodulation of amplitude modulated signals.

Demodulation circuits adapted to a particular standard are classified in H04N 11/00, e.g. H04N 11/146 (NTSC), H04N 11/165 (PAL) or H04N 11/186 (SECAM).

References
Limiting references

This place does not cover:

Circuits for processing the brightness signal and the chrominance signal relative to each other

H04N 9/77

Camera processing pipelines for processing colour signals

H04N 23/84

Informative references

Attention is drawn to the following places, which may be of interest for search:

Colour picture communication system

H04N 1/46

Colour picture signal processing

H04N 1/56

Facsimile colour picture signal processing

H04N 1/60

Colour television signal testing

H04N 17/02

Image processing, image enhancement

G06T

Amplitude demodulation

H03D

{Multi-purpose receivers, e.g. for auxiliary information (H04N 9/642 takes precedence)}
Definition statement

This place covers:

Circuits for multiple input selection or for selecting a particular colour signal type.

References
Limiting references

This place does not cover:

Multi-standard receivers

H04N 9/642

{Multi-standard receivers}
Definition statement

This place covers:

Multistandard colour decoding circuits.

{Hue control means, e.g. flesh tone control}
Definition statement

This place covers:

  • Face detection circuits,
  • Hue control.
References
Limiting references

This place does not cover:

Acquiring or recognising human faces, facial parts, facial sketches, facial expressions

G06V 40/16

Informative references

Attention is drawn to the following places, which may be of interest for search:

Hue control relating to non-moving picture signals

H04N 1/6075

for synchronous demodulators
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Demodulation circuits adapted to the NTSC standard

H04N 11/146

Demodulation circuits adapted to the PAL standard

H04N 11/165

Demodulation circuits adapted to the SECAM standard

H04N 11/186

for matrixing (camera processing pipelines for matrixing of colour signals H04N 23/85)
Definition statement

This place covers:

Colour space transformation circuits.

References
Limiting references

This place does not cover:

Camera processing pipelines for matrixing of colour signals

H04N 23/85

Informative references

Attention is drawn to the following places, which may be of interest for search:

Colour space transformation circuits relating to non-moving picture signals

H04N 1/6077

for controlling the amplitude of colour signals, e.g. automatic chroma control circuits (H04N 9/71, H04N 9/73 take precedence; camera processing pipelines for controlling the colour saturation of colour signals H04N 23/86)
References
Limiting references

This place does not cover:

Circuits for processing colour signals for colour killing combined with colour gain control

H04N 9/71

Colour balance circuits

H04N 9/73

Camera processing pipelines for controlling the colour saturation of colour signals

H04N 23/86

Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for controlling amplitude response, applicable to television systems in general

H04N 5/20

for reinsertion of DC and slowly varying components of colour signals (camera processing pipelines for reinsertion of DC or slowly varying components of colour signals H04N 23/87)
References
Limiting references

This place does not cover:

Camera processing pipelines for reinsertion of DC or slowly varying components of colour signals

H04N 23/87

Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for reinsertion of dc and slowly varying components of signals, applicable to television systems in general

H04N 5/16

Colour balance circuits, e.g. white balance circuits or colour temperature control (camera processing pipelines for colour balance H04N 23/88)
Definition statement

This place covers:

Colour balance control.

References
Limiting references

This place does not cover:

Camera processing pipelines for colour balance

H04N 23/88

Informative references

Attention is drawn to the following places, which may be of interest for search:

Colour balance control relating to non-moving picture signals

H04N 1/60

Circuits for processing the brightness signal and the chrominance signal relative to each other, e.g. adjusting the phase of the brightness signal relative to the colour signal, correcting differential gain or differential phase (circuits for matrixing H04N 9/67)
Definition statement

This place covers:

  • Separation of the luminance and chrominance signals from a multiplexed composite colour television signal (an illustrative comb-filter sketch follows this list)
  • Processing of luminance and chrominance signals in relation to each other (differential gain, differential phase, luminance and chrominance correlated enhancement or noise suppression, etc.)
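
As one classical illustration of the separation mentioned in the first item (a minimal sketch, assuming an NTSC-like composite signal whose chrominance component inverts phase from one line to the next; the sample values are invented for the example), a 1H comb filter recovers luminance as the average of two successive lines and chrominance as half their difference:

    # Illustrative 1H comb filter for luminance/chrominance separation, assuming
    # a composite signal whose chrominance component inverts phase on successive
    # lines.  Averaging two lines cancels the chrominance and yields luminance;
    # half the difference yields the chrominance.

    def comb_separate(line_prev, line_curr):
        """Return (luminance, chrominance) estimates for the current line."""
        luma = [(a + b) / 2.0 for a, b in zip(line_prev, line_curr)]
        chroma = [(b - a) / 2.0 for a, b in zip(line_prev, line_curr)]
        return luma, chroma

    if __name__ == "__main__":
        # Two successive composite lines built from luminance [0.5, 0.75] and a
        # chrominance component [0.125, -0.25] that flips sign from line to line.
        prev = [0.5 - 0.125, 0.75 + 0.25]
        curr = [0.5 + 0.125, 0.75 - 0.25]
        print(comb_separate(prev, curr))   # ([0.5, 0.75], [0.125, -0.25])
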
References
Limiting references

This place does not cover:

Circuits for matrixing

H04N 9/67

Processing of colour television signals in connection with recording
Definition statement

This place covers:

Video data recording:

  • Specially adapted recording devices such as a VCR, PVR, high speed camera, camcorder or a specially adapted PC
  • Interfaces between recording devices and other devices for input and/or output of video signals such as TVs, video cameras, other recording devices
  • Video recorder programming
  • Adaptations of the video signal for recording on specific recording media such as HDD, tape, drums, holographic support, semiconductor memories
  • Adaptations for reproducing at a rate different from the recording rate such as trick play modes and stroboscopic recording
  • Processing of the video signal for noise suppression, scrambling, field or frame skip, bandwidth reduction
  • Impairing the picking up, for recording, of a projected video signal
  • Regeneration of either a recorded video signal or for recording the video signal
  • Video signal recording wherein the recorded video signal may be accompanied by zero, one or more further video signals (stereoscopic signals or video signals corresponding to different story lines)
  • Production of a motion picture film from a television signal

Details specific to this group:

  • The recording equipment is for personal use and not for studio use
  • The subgroups of H04N 9/79 are for colour video signals
Relationships with other classification places
  • Recording and processing for recording of video signals covered by the subject-matter in the range H04N 5/76 - H04N 5/907 is classified in said range irrespectively of said video signals being in colour or black and white.
  • The range H04N 9/79 - H04N 9/898 deals with recording and processing for recording colour video signals while the corresponding range H04N 5/92 - H04N 5/956 deals with recording and processing for recording black and white video signals.
  • H04N 9/79 (video recording) distinguishes itself from editing, which is found in G11B 27/00, in that the signals recorded and reproduced are video signals.
  • H04N 9/79 is a function place for recording or processing for recording. H04N 21/433 describes applications for recording in a distribution system.
  • H04N 9/79 features recording devices specially adapted to video data recording that can be programmed. The programming may be done by a user or by using an algorithm. Business methods where the video recording feature or step is well known are generally classified in G06Q 30/02.
  • H04N 9/79 contains recording devices that are characterised by the connection to other devices through an interface. Typically information is sent or received by a recorder through an interface that impacts the recording or playback function. Interfaces in general are found in H04N 5/44.
  • H04N 9/79 contains video cameras that record video data to a recording medium. Constructional details of video cameras are found in H04N 23/00.
  • H04N 9/79 is an application place for video data trick play. Reproducing data in general at a rate different from the recording rate is found in G11B 27/005.
  • H04N 9/79 contains applications of video data processing for scrambling/encrypting video data for recording. Systems for rendering a video signal unintelligible are found in H04N 7/16 and H04N 21/00.
  • H04N 9/79 is an application place for video data reduction for recording. Video data compression is found in H04N 19/00.
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Video surveillance:

H04N 7/18

Selective content distribution

H04N 21/00

Controlling video cameras:

H04N 23/60

Alarm system using video cameras

G08B 13/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Production of a video signal from a motion picture film

H04N 5/253

Interfaces

H04N 5/44

Television signal processing in connection with recording, in general

H04N 5/91

Video data coding

H04N 19/00

Network video distribution

H04N 21/236

User interface of set top boxes

H04N 21/47

Video camera constructional details

H04N 23/00

Video data processing for printing

G03F 1/00

Systems for buying and selling, i.a. video content

G06Q 30/00

Business methods related to the distribution of video data content

G06Q 30/02

Video editing

G11B 27/034

Recording techniques specially adapted to a recording medium for recording digital data in general

G11B 27/10

Control of video recorders where the video signal is not substantially involved

G11B 31/00

Special rules of classification

A document that does not explicitly mention that the video signal is a monochrome video signal is to be interpreted as relating to a colour video signal. As a consequence, some classes in H04N 5/76 specific to monochrome signal recording have fallen out of use. Instead, the corresponding colour symbols should be given to such documents.

Allocation of CPC symbols:

  • A document containing invention information relating to video data recording will be given an H04N 9/79 CPC group.
  • A document containing additional information relating to video data recording (in particular, if the document discloses a detailed video recording device) will be given a H04N 9/79 Indexing Code symbol.
  • A document containing invention information for more than one invention may be given more than one H04N 9/79 CPC group.
  • A document merely mentioning recording will not be given a CPC group, but it may receive an Indexing Code if the disclosure is considered relevant.
  • Allocation of Indexing Code symbols in combination with CPC:
  • When assigning H04N 9/79 or a subclass thereof as CPC group, giving an additional Indexing Code is optional.
  • Combined use of Indexing Code symbols:
  • Indexing Code symbols may be allocated as necessary to describe additional information in the document.
  • Symbol allocation rules:
  • Documents defining recording devices that have an interface, e.g., connected to a network, should have at least one of the more specific H04N 5/765 Indexing Code symbols.
  • Documents dealing with invention information about measures to prevent recording of projected images should be given the H04N 2005/91392 Indexing Code symbol.
Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Video or video data

analogue or digital video signal, with or without accompanying audio

{for more than one standard}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Recording a plurality of video formats

H04N 9/7921

for recording the signal in a plurality of channels, the bandwidth of each channel being less than the bandwidth of the signal (H04N 9/804, H04N 9/81, H04N 9/82 take precedence)
References
Limiting references

This place does not cover:

Transformation of the television signal for recording involving pulse code modulation of the colour picture signal components

H04N 9/804

Transformation of the television signal for recording, the individual colour picture signal components being recorded sequentially only

H04N 9/81

Transformation of the television signal for recording, the individual colour picture signal components being recorded simultaneously only

H04N 9/82

Informative references

Attention is drawn to the following places, which may be of interest for search:

Television signal processing for bandwidth reduction, by dividing samples or signal segments among a plurality of recording channels, in general

H04N 5/919

Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Transformation of the television signal for recording, e.g. modulation, frequency changing, in general; Inverse transformation for playback, applicable to television systems in general

H04N 5/92

Modulation

H03C

Demodulation or transference of modulation from one carrier to another

H03D

involving pulse code modulation of the colour picture signal components
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Transformation of the television signal for recording by pulse code modulation, in general; Inverse transformation for playback thereof

H04N 5/926

{involving data reduction}
Definition statement

This place covers:

Coding/decoding when done using an MPEG standard.

{involving the multiplexing of an additional signal and the colour video signal}
Definition statement

This place covers:

Systems where additional information necessary to retrieve the video data, e.g. chapter marks, navigation packs or time stamps, is recorded with the video information, either on the same recording medium or on an associated recording medium.

Regeneration of colour television signals (H04N 9/80 takes precedence)
References
Limiting references

This place does not cover:

Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback

H04N 9/80

Informative references

Attention is drawn to the following places, which may be of interest for search:

Regeneration of the television signal or of selected parts thereof, in general

H04N 5/93

by assembling picture element blocks in an intermediate memory
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Regeneration of the television signal or of selected parts thereof, by assembling picture element blocks in an intermediate store in general

H04N 5/937

Signal drop-out compensation
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Signal drop-out compensation, applicable to television systems in general

H04N 5/94

for signals recorded by pulse code modulation
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Signal drop-out compensation for signals recorded by pulse code modulation, applicable to television systems in general

H04N 5/945

error detection or correction of digital signals for recording in general

G11B 20/18

Time-base error compensation
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Time-base error compensation, applicable to television systems in general

H04N 5/95

using an analogue memory, e.g. a CCD shift register, the delay of which is controlled by a voltage controlled oscillator
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Time-base error compensation by using an analogue memory, e.g. a CCD-shift register, the delay of which is controlled by a voltage controlled oscillator, applicable to television systems in general

H04N 5/953

using frequency multiplication of the reproduced colour signal carrier with another auxiliary reproduced signal, e.g. a pilot signal carrier {(H04N 9/83 takes precedence)}
References
Limiting references

This place does not cover:

Transformation of the television signal for recording, the recorded chrominance signal occupying a frequency band under the frequency band of the recorded brightness signal

H04N 9/83

Informative references

Attention is drawn to the following places, which may be of interest for search:

Time-base error compensation using frequency multiplication of the reproduced colour signal carrier with another auxiliary reproduced signal, e.g. a pilot signal carrier, applicable to television systems in general

H04N 5/956

Colour television systems (details H04N 9/00)
Definition statement

This place covers:

Hardware-related or software-related aspects specific to the transmission of colour television signals, in particular the transmission of analogue colour television signals (e.g. NTSC, PAL, SECAM)

Relationships with other classification places

H04N 11/00 is distinguished from transmission systems using pulse code modulation with bandwidth reduction in which the chrominance component, or any other type of colour component, is subjected to processing equivalent to that of the luminance component (e.g. MPEG standards); such systems are found in H04N 7/00 and H04N 21/00.

References
Limiting references

This place does not cover:

Colour picture communication systems

H04N 1/46

Details of colour television systems

H04N 9/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

High-definition television systems

H04N 7/015

Special rules of classification

H04N 11/00 features a number of EC symbols corresponding to the same number of Indexing Codes (e.g. H04N 11/14 as EC symbol and H04N 11/14 as Indexing Code symbol).

Allocation of EC symbols and/or Indexing Code symbols:

A document containing invention information relating to colour television systems will be given an H04N 11/00 EC group.

A document containing additional information relating to colour television systems will be given an H04N 11/00 EC group.

A document merely mentioning details of colour television systems will not be given an EC group, but it may receive an Indexing Code if the disclosure is considered relevant.

with bandwidth reduction (H04N 11/04 {, H04N 11/24} take precedence)
References
Limiting references

This place does not cover:

Using pulse code modulation

H04N 11/04

High definition television systems

H04N 11/24

Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems in which the television signal is transmitted via one channel or a plurality of parallel channels, the bandwidth of each channel being less than the bandwidth of the television signal, in general

H04N 7/12

Methods or arrangements for coding, decoding, compressing or decompressing digital video signals

H04N 19/00

using pulse code modulation {(H04N 11/24 takes precedence)}
References
Limiting references

This place does not cover:

High definition television systems

H04N 11/24

Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems for the transmission of television signals using pulse code modulation, in general

H04N 7/24

Pulse code modulation in general

H03K, H03M

Transmission systems using pulse code modulation, in general

H04B 14/04

using sequential signals only (dot sequential systems H04N 11/12)
References
Limiting references

This place does not cover:

Dot sequential systems

H04N 11/12

Conversion of the manner in which the individual colour picture signal components are combined, e.g. conversion of colour television standards
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Conversion of standards in television systems in general

H04N 7/01

High-definition television systems
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

High-definition television systems in general

H04N 7/015

Stereoscopic video systems; Multi-view video systems; Details thereof
Definition statement

This place covers:

Systems that generate stereoscopic or multi-view signals from cameras, or provide stereoscopic or multi-view signals to displays. It also covers electronic signal processing aspects of such systems.

Examples:

  • Stereoscopic and multi-view electronic image pick up devices (video cameras, digital still cameras)
  • Stereoscopic and multi-view display devices
  • Electronic signal processors: for stereoscopic signal processing; for monoscopic-to-stereoscopic conversion; for stereoscopic image generation (including from a computer model); for stereoscopic displays (e.g. for left/right synchronisation, stereoscopic format conversion or depth adaptation); for displays providing different 2D images to different viewers (e.g. for use in vehicles); for devices that generate a two-dimensional "look around" effect, e.g. non-stereoscopic multi-view systems (see, however, the exclusions below)
  • Devices generating a real 3D image, i.e. an image having a volume (volumetric displays)
  • Pseudo-stereoscopic systems

Systems in which the viewer's eyes do not see different images, but which may provide a pseudo-stereoscopic effect, are classified in H04N 13/00. The effect must go beyond that provided by the mere display of a 3D object on a 2D screen (like in a CAD system).

Example: Wiggle stereoscopy: pseudo-stereo systems providing a three dimensional effect by means of normal 2D image signals, by periodic oscillating motion of a 3D object.

  • Multi-view systems: systems providing different 2D or 3D views of the same scene to one or more viewers according to the viewpoint location (called "look around" effect); systems providing different 2D or 3D views of different scenes to different viewers (called "privacy" systems)

These systems are classified in H04N 13/00 if they provide said views simultaneously or at least at a sufficiently high frame rate so as to be simultaneously viewed by the viewers.

However, multi-view systems wherein said 2D views are provided to a viewer one at a time, e.g. by user selection, are not classified in H04N 13/00, because they are actually normal 2D systems although the viewpoint can be selected at will.

Examples of multi-view devices falling under H04N 13/00:

  • "look-around" display systems including displays in which a lenticular lens provides different views of a common scene from different viewing positions
  • "privacy" display systems including displays in which a parallax barrier provides different views of different scenes to different viewers in 2D or 3D (for example in a vehicle, wherein on a common screen the driver is watching GPS while the passenger is watching a movie)
  • Multi-user displays displaying different pictures for different viewers wearing shutter glasses to select one of said pictures (this is also "privacy"), wherein said pictures are 2D or 3D pictures.
Relationships with other classification places

Subgroups under H04N 5/00 and H04N 7/00 relate to the basic monoscopic video aspects from which corresponding stereoscopic aspects are derived.

Classification and search in these groups should therefore be considered whenever no specifically stereoscopic aspects are present.

Analysis of video signals to perform real-time control of a stereoscopic video camera, or to identify the image transmission format to drive a stereoscopic display, is classified in H04N 13/00.

Ordinary 2D displays arranged to display solid objects, e.g. in a CAD system, are sometimes called 3D displays. Such displays allow the viewer to rotate 3D objects to see them from any direction. Such displays are not classified in H04N 13/00. This is because a viewer sees the same picture with both eyes and because, if there is more than one viewer, all viewers see the same picture. The manipulation of 3D models or images for computer graphics is covered by G06T 19/00.

Volumetric displays are classified in H04N 13/388 and holographic displays are classified in G03H 1/26, whereas autostereoscopic displays are classified in H04N 13/302.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Projection displays

H04N 5/74, H04N 9/31

Video standard conversion

H04N 7/01

Colour signal processing circuits

H04N 9/64

Video stream synchronisation / multiplexing /packetisation aspects

H04N 21/00

Video signal reformatting

H04N 21/4402, H04N 21/2343

Aspects concerning subtitles or other OSD information

H04N 21/488

Generation or processing of metadata

H04N 21/84

Television cameras

H04N 23/00

Arrangements of television cameras

H04N 23/90

Optical systems

G02B 27/00

Stereoscopic photography

G03B 35/00

Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them

G03H 1/00

Image processing or generation in general

G06T 7/00

Calculation or rendering of a monoscopic view of a 3D graphics object

G06T 15/20

Generation of 3D graphical models or scenes for digital data transmission as such

G06T 17/00

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Stereoscopic

Providing (exactly) two different views, one for the left eye and one for the right eye

2D

Two dimensional

3D

Three dimensional, sometimes also used to mean stereoscopic

Autostereoscopic displays

A display device not requiring glasses to provide a stereoscopic effect to the viewer. An autostereoscopic display uses a parallax-generating optic which projects or displays different images to the viewer, thus creating a sense of depth. The parallax-generating optic may include, for example: parallax barriers; lenticular lenses; an array of controllable light sources or a moving aperture or light source; fly-eye lenses; dual and multilayer devices that are driven by algorithms to implement compressive light field displays (such devices are also called Content-Adaptive Parallax Barriers); varifocal lenses or mirrors. It is noted that volumetric displays are classified in H04N 13/388 and holographic displays are classified in G03H 1/26, whereas autostereoscopic displays are classified in H04N 13/302.

Multi-view

Providing more than two different views to one or more viewers according to their viewing position or direction; the views can be 2D or 3D

Automultiscopic displays

This is a shorter synonym for the expression "multi-view autostereoscopic 3D display"

Volumetric displays

A device generating a "solid" image, i.e. not an image on the surface of a display, but one having a real depth, for example by projecting 2D image slices at different planes within a viewing volume. Such systems have been considered to fall within the definition of stereoscopic systems because the viewer's eyes perceive two different pictures.

Lenticular lenses

An array of thin cylindrical lenslets (normally less than 1mm wide) placed vertically in front of, or behind a display or light modulator in order to generate optically directive views in autostereoscopic displays or cameras.

Parallax barriers

An array of opaque strips and thin slits arranged to occlude portions of a displayed image in left and right viewing regions. The slits are spatially arranged to ensure that the left/right image portions are only visible in the corresponding left/right viewing regions for which they are intended. The parallax barrier may be provided by a static physical layer in which the slits are precisely positioned, or electronically generated on an adaptive intermediate LCD layer. The parallax barrier may also be adjacent to camera circuitry for image collection.

Fly-eye lenses

An array of very small bidimensional lenses (typically circular / hemispherical) placed in front of a display, light modulator or image sensor like a normal lenticular lens, providing bidimensional parallax.

Pseudo-stereoscopic

Relating to stereoscopic or 3D visual effects obtained without sending different views to the viewer's eyes. The same term is sometimes used to denote the effect whereby the left and right images are seen by the wrong eyes, due to viewing from an unsuitable position in front of an auto-stereoscopic display.

Integral imaging

A technique of image capture or display which uses a fly's eye or a lenticular lens in front of the image sensor/display in order to capture/display images with parallax.

Plenoptic cameras

A camera, normally non-stereoscopic, using a technique allowing focusing after image capture, by means of a lenticular lens array combined with a plurality of (small) image sensors. A plenoptic camera is also known as a light-field camera.

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

LCD

Liquid Crystal Display

SLM

Spatial Light Modulator

OSD

On-Screen Display

CAD

Computer Aided Design

DMD

Digital Micromirror Device

In patent documents, the following words/expressions are often used as synonyms:

  • "3D" and "stereoscopic"
  • "automultiscopic" and "multi-view autostereoscopic"
  • "lenticular screen", "lenticular lens array" and "lenticular array"
  • "plenoptic camera" and "light-field camera"
Processing, recording or transmission of stereoscopic or multi-view image signals
Definition statement

This place covers:

Device-independent processing of stereoscopic or multi-view image signals

Processing image signals (for multi-view video sequence encoding H04N 19/597)
References
Limiting references

This place does not cover:

Multi-view video sequence encoding

H04N 19/597

Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
Special rules of classification

Attention should be paid to the word "transformation": here a new virtual image is generated starting from one or more already existing stereoscopic images, e.g. by interpolation. In contrast, new computer-generated stereoscopic images not derived from existing images are classified in H04N 13/275.
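By way of illustration only, the following is a minimal forward-warping sketch of such a transformation, assuming a per-pixel disparity map for the left view is available (all names and parameters here are hypothetical, not part of the classification scheme):

```python
import numpy as np

def interpolate_view(left, disparity, t=0.5):
    """Naive sketch of virtual-view synthesis by interpolation: each pixel of the
    left view is shifted by a fraction t of its disparity to approximate an
    intermediate viewpoint. Occlusion holes are left unfilled (zeros)."""
    h, w = left.shape[:2]
    out = np.zeros_like(left)
    xs = np.arange(w)
    for y in range(h):
        new_x = np.clip((xs + t * disparity[y]).astype(int), 0, w - 1)
        out[y, new_x] = left[y, xs]
    return out
```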

Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues (H04N 13/128 takes precedence)
Definition statement

This place covers:

Modification of image signals to enhance the viewer's perception of the 3D effect. Such modification may include:

  • Addition of depth cues such as defocusing, colouring, shadows
  • Geometric correction or warping
  • Left/right or temporal crosstalk reduction
Relationships with other classification places

If the content is not modified, this group is not relevant.

If the 3D impression is improved by horizontally shifting one of the images with respect to the other, or by modifying the depth map, then the document should be classified in H04N 13/128.

References
Limiting references

This place does not cover:

Adjusting depth or disparity

H04N 13/128

Adjusting depth or disparity
Definition statement

This place covers:

Depth adjustment, e.g.:

  • Control of disparity between L and R images
  • Processing of depth maps
  • Non-linear processing of depth in order to adapt it to display features such as screen size
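
A minimal sketch of the kind of depth processing listed above, assuming an 8-bit depth map (255 = nearest) and a hypothetical per-display disparity budget; this is an illustration only, not a prescribed method:

```python
import numpy as np

def remap_depth_to_disparity(depth_map, max_disparity_px=24.0, gamma=1.5):
    """Non-linearly compress an 8-bit depth map into a display's disparity budget,
    flattening far depths more strongly than near ones (gamma > 1)."""
    d = depth_map.astype(np.float32) / 255.0     # 0.0 (far) .. 1.0 (near)
    return (d ** gamma) * max_disparity_px       # per-pixel disparity in pixels

def shift_view(view, extra_shift_px):
    """Globally shift one view horizontally to change the overall L/R disparity
    (and hence the perceived depth) of a stereo pair."""
    return np.roll(view, int(extra_shift_px), axis=1)
```
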
Relationships with other classification places

Reduction of depth parameters to reduce eye strain (fatigue) caused by flicker should be classified here and in H04N 13/144, provided that the depth parameters are controlled by the image signal and not by the display parameters.

If depth adjustment is obtained by acting only on device parameters, i.e. there is no stereoscopic image signal processing, the document should not be classified here but only in the relevant device groups, H04N 13/20 and H04N 13/30.

For example, if depth is adjusted by controlling the baseline (the physical distance between two cameras of a stereo camera), the adjustment should be classified in H04N 13/239 in combination with H04N 13/296.

Format conversion, e.g. of frame-rate or size
Definition statement

This place covers:

Conversion of any kind of stereoscopic format into another one, e.g. from side-by-side to top-bottom or "2D+depth", or still to side-by-side but with a different size, resolution or frame rate
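
For illustration, a minimal sketch of one such repacking (side-by-side to top-bottom), using naive nearest-neighbour resampling and assuming even frame dimensions; real converters would also handle resolution and frame-rate changes:

```python
import numpy as np

def side_by_side_to_top_bottom(frame):
    """Repack a side-by-side stereo frame (each view h x w/2) into a top-bottom
    frame (each view h/2 x w) of the same overall size."""
    h, w = frame.shape[:2]
    left, right = frame[:, : w // 2], frame[:, w // 2 :]
    # drop every other line and duplicate every column: h x w/2 -> h/2 x w
    left_tb = left[::2].repeat(2, axis=1)
    right_tb = right[::2].repeat(2, axis=1)
    return np.concatenate([left_tb, right_tb], axis=0)
```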

Relationships with other classification places

The generation of stereoscopic signals from monoscopic source signals is classified in H04N 13/261 or in relevant groups under H04N 13/20. Format conversion should be classified here only if it concerns stereoscopic (or multi-view) signals and if the conversion goes beyond the equivalent processing of monoscopic image signals.

Standards conversion of monoscopic TV signals (e.g. PAL to NTSC), or the adaptation of signals to the display format of a display terminal, should be classified in H04N 7/01.

Mixing image signals
Definition statement

This place covers:

The generation of stereoscopic (or multi-view) images from at least two source images, wherein the contents of both source images remain visible in the resultant mixed image, i.e. the generation of one image including the weighted sum of said two source images.
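
A minimal sketch of the weighted-sum mixing described above, assuming two equally sized 8-bit images held as NumPy arrays (the function name and weighting are illustrative only):

```python
import numpy as np

def mix_images(img_a, img_b, alpha=0.5):
    """Weighted sum of two source images; the contents of both remain visible."""
    mixed = alpha * img_a.astype(np.float32) + (1.0 - alpha) * img_b.astype(np.float32)
    return np.clip(mixed, 0, 255).astype(np.uint8)
```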

Relationships with other classification places

The reproduction of mixed stereoscopic images or mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay on a monoscopic image background, is classified in H04N 13/361.

Overlays such as subtitles and similar graphic images are to be classified in H04N 13/183.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Mixing monoscopic television image signals

H04N 5/265

Encoding, multiplexing or demultiplexing different image signal components (for multi-view video sequence encoding H04N 19/597)
Definition statement

This place covers:

  • Aspects of manipulating the structure of the stereoscopic video signal, i.e. how the different image signals which constitute a stereoscopic (or multi-view) image signal are encoded or combined in order to form a complete video signal, e.g. for storage or transmission.
  • The separation of stereoscopic or multi-view image signals into their respective constituent (e.g. left and right) components.

"Multiplexing" and "demultiplexing" are to be interpreted in the general sense mentioned above, i.e. any manner of forming a stereoscopic image frame, stream or signal from e.g.

  • left and right signals
  • a 2D image and a depth image by arranging the components in a format having e.g.
  • alternate L/R frames or fields
  • side by side L/R images
  • top/down L/R images
  • main layer / enhancement layer
  • component images having different resolutions
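
As an illustration of one of the formats listed above, a minimal sketch of side-by-side multiplexing and demultiplexing of an L/R pair (naive horizontal subsampling, even frame width assumed; illustrative only):

```python
import numpy as np

def pack_side_by_side(left, right):
    """Multiplex an L/R pair into one side-by-side frame
    (half horizontal resolution per view)."""
    return np.concatenate([left[:, ::2], right[:, ::2]], axis=1)

def unpack_side_by_side(frame):
    """Demultiplex a side-by-side frame back into (upsampled) L and R views."""
    w = frame.shape[1]
    left, right = frame[:, : w // 2], frame[:, w // 2 :]
    return left.repeat(2, axis=1), right.repeat(2, axis=1)
```
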
Relationships with other classification places

Aspects relating to the general encoding of stereoscopic or multi-view image signals are classified here. Prediction encoding to compress the image signal (e.g. using temporal or spatial prediction techniques) specially adapted for multi-view video sequences is classified in H04N 19/597.

Further, attention should be paid to the term "image signal components" which is used in a strict sense. Non-image signal components are to be classified in H04N 13/172 and subgroups thereof.

References
Limiting references

This place does not cover:

Prediction encoding specially adapted for multi-view video sequences

H04N 19/597

Metadata, e.g. disparity information
Definition statement

This place covers:

Metadata concerning stereoscopic features included in a stereoscopic video stream or image file

On-screen display [OSD] information, e.g. subtitles or menus
Definition statement

This place covers:

Details relating to subtitles or other OSD information, which are included in a stereoscopic video stream separate from the image(s), e.g. information describing how to merge subtitles with the main image, or how to avoid depth conflicts, depth interference etc.

Recording image signals; Reproducing recorded image signals
Relationships with other classification places

This group is used to classify aspects concerning the recording of stereoscopic or multi-view image signals and the reproduction thereof. Recording of monoscopic video signals and monoscopic aspects of stereoscopic video signals is classified in H04N 5/76.

Transmission of image signals
Relationships with other classification places

This group is used to classify aspects relating to the transmission of stereoscopic or multi-view image signals. Such aspects are often quite close to the corresponding monoscopic ones, because once a stereoscopic video stream has been assembled, it is generally recorded or transmitted with monoscopic techniques. Transmission of monoscopic image signals is classified elsewhere in H04N, e.g. H04N 5/38, H04N 21/00 (for selective content distribution systems).

Image signal generators
Definition statement

This place covers:

  • The generation of electronic image signals representative of stereoscopic or multi-view images
  • Computer-generated stereoscopic or multi-view image signals;
  • Signal processing and control systems therefor.

Note:

The generated stereoscopic signals may be in any format, e.g. L + R, 2D + depth map, 3D + depth map. Note, however, that devices which do not capture optical images (e.g. 3D scanners, time-of-flight cameras, rangefinders etc.) are not classified in H04N 13/00: they are classified in the groups indicated below.

Relationships with other classification places

Monoscopic plenoptic cameras generating a single viewpoint are classified in the relevant groups under H04N 23/00.

Plenoptic cameras / integral imaging cameras, which provide more than one viewpoint, are to be classified in H04N 13/20, in particular in H04N 13/282 if they provide more than two different geometrical viewpoints.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Projection displays

H04N 5/74, H04N 9/31

Recording, including multiplexing another television signal

H04N 5/92, H04N 9/82

Video standards conversion

H04N 7/01

Colour signal processing circuits

H04N 9/64

Video stream synchronization / multiplexing / packetization aspects

H04N 21/00

Video signal reformatting

H04N 21/4402, H04N 21/2343

Aspects concerning subtitles or other OSD information

H04N 21/488

Generation or processing of metadata

H04N 21/84

Television cameras

H04N 23/00

Arrangements of television cameras (not for capturing stereoscopic images)

H04N 23/90

Time-of-flight [TOF] cameras

G01S 17/08

Optical systems for producing stereoscopic or other three dimensional effects

G02B 30/00

Stereoscopic photography by sequential recording

G03B 35/02

Stereoscopic photography by simultaneously recording

G03B 35/08

3D scanners

G06F 3/01

Depth or shape recovery

G06T 7/50

Generation of a depth map from stereoscopic image signals

G06T 7/593

Calculation or rendering of a monoscopic view of a 3D graphics object

G06T 15/20

Generation of 3D graphical models or scenes

G06T 17/00

Manipulating 3D models or images for computer graphics

G06T 19/00

using temporal multiplexing
Definition statement

This place covers:

Alternate acquisition of images from different viewpoints, each image acquired at a different time

using spectral multiplexing
Definition statement

This place covers:

Simultaneously capturing images from several geometrical viewpoints, each image having different spectral characteristics

using spatial multiplexing
Definition statement

This place covers:

Simultaneously capturing images from several geometrical viewpoints on different parts of the image pickup sensor

using fly-eye lenses, e.g. arrangements of circular lenses
Special rules of classification

Plenoptic cameras, i.e. lens array cameras for providing stereoscopic or 3D images, are classified here even if each lens of the fly-eye lens is placed on a different chip (the image sensor is considered to be one even if it is composite)

using two 2D image sensors having a relative position equal to or related to the interocular distance (H04N 13/243 takes precedence)
References
Limiting references

This place does not cover:

using three or more 2D image sensors

H04N 13/243

Calibration of cameras
Definition statement

This place covers:

Aspects relating to the control of a stereoscopic camera in order to obtain aligned images, i.e. images that only differ by a horizontal disparity, but that have no relative rotation or other geometric distortion therebetween.

Relationships with other classification places

The so-called stereo (camera) calibration aspects, wherein an already captured image pair is processed to determine and compensate the same above-mentioned distortions, are to be classified in G06T 7/80; such aspects differ from the aspects classified in this group in that they do not "relate to the control of a stereoscopic camera".

in combination with electromagnetic radiation sources for illuminating objects
Definition statement

This place covers:

Aspects relating to the use of light for obtaining a stereoscopic image, e.g. illumination with structured light in order to capture depth, or illumination from different sides or with different colours to obtain left and right images.

Relationships with other classification places

Normal illumination devices (flash or continuous illumination) are classified in H04N 23/00 and, if exposure aspects are involved, in H04N 23/70. If structured illumination is used for measuring contours or curvatures, see G01B 11/25. For procedures and apparatus for illuminating a scene in general, see G03B 15/02.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Laser ranging using the projection of structured light to facilitate image analysis for depth or shape recovery

G06T 7/521

with monoscopic-to-stereoscopic image conversion
Definition statement

This place covers:

Devices obtaining a stereoscopic image from one or more existing monoscopic images

Relationships with other classification places

In this group the capturing conditions of the monoscopic images are unknown or irrelevant, whereas in H04N 13/207 and subgroups stereoscopic images are generated from a camera controlled to provide images of different viewpoints, so that no "conversion" is necessary.

from 3D object models, e.g. computer-generated stereoscopic image signals
Definition statement

This place covers:

Systems using a computer for generating stereoscopic images, e.g. a fully synthetic stereoscopic image from a CAD-type 3D object model

Relationships with other classification places

The generation of a new image from a virtual viewpoint from existing stereoscopic images is covered by H04N 13/111 and its subgroup. 3D modelling for computer graphics is covered by G06T 17/00.

Image reproducers (optical systems for producing stereoscopic or other three-dimensional effects G02B 30/00)
Definition statement

This place covers:

  • Devices for stereoscopic or multi-view electronic image signal display.
  • Devices for electronic image signal display for generating different views of a scene according to the viewpoint location.
  • Devices for electronic image signal display for generating different views for different viewers.
  • Devices for electronic image signal display for generating a view visible only by a specific viewer.
  • Devices for volumetric three dimensional electronic image signal display.
  • Devices for pseudo-stereoscopic display systems. For example: wiggle stereoscopy or pseudo-stereo systems providing a three-dimensional effect by means of normal 2D image signals, by periodic oscillating motion of a 3D object.
  • Devices which generate different two-dimensional views in the vertical direction by using a horizontally arranged parallax optic and displaying different images in the vertical direction.
  • Electronic signal processing and control therefor, for example signal processors and controllers:
  • for left/right synchronisation, stereoscopic format conversion or depth adaptation
  • for backlight control or electrical control of properties of a lenticular lens
  • for providing different 2D images to different viewers (e.g. for use in vehicles)
  • for devices which generate a two-dimensional "look around" effect, e.g. non-stereoscopic multi-view systems, when the user's position is tracked or when different images are displayed in the vertical direction on a display using a horizontally arranged parallax optic
  • for controlling image flipping (or inverse image), caused by the noticeable transition between the viewing zones
  • for controlling the picket fence effect, a moiré-like artefact caused by the gaps between sub-pixels being magnified by the lenticular sheet, for example by use of a slanted parallax optic (blurring the boundaries between the viewing zones can increase the apparent number of views, broadening the observation angle of the pixels)
  • for reducing ghosting or crosstalk
  • for controlling resolution loss of images with high perceived depth, for example by controlling the distance between the pixels and the array of lenticular lens elements
  • for controlling the stereoscopic image generation in dependence on the user position and orientation
  • for controlling the stereoscopic image generation in dependence on the display position and orientation
  • Constructional arrangements and manufacturing methods for stereoscopic display devices, for example details related to:
  • colour pixel arrangement with respect to the parallax optic, layout or shape of pixels
  • mechanical control of the position of the parallax optic
  • user interfaces for controlling or indicating the stereoscopic image display properties, like the amount of displayed depth or switching between 2D/3D mode
  • arrangements for improving the stereoscopic impression, e.g. by using an additional frame placed in front of the screen
References
Limiting references

This place does not cover:

Optical systems for producing stereoscopic or other three dimensional effects

G02B 30/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Holographic volumetric displays

G03H 1/26

Synonyms and Keywords

In patent documents, the following words/expressions are often used as synonyms:

In patent documents, the expression "multi-view display" is often used as a synonym for "privacy" displays, for example multi-user displays displaying different pictures for different viewers wearing shutter glasses to select one of said pictures (this is also "privacy"), wherein said pictures may be 2D or 3D pictures. Such privacy displays are not multi-view displays for the purpose of this classification. However, these privacy display devices (which, for example, use an image separation optic, e.g. a parallax optic, a shutter or polarisation glasses, for generating privacy images for a specific viewer) also fall under H04N 13/30.

In patent documents, the following words/expressions are often used with the meaning indicated:

In patent documents, the expression "Three dimensional (3D)" is often used with the meaning "stereoscopic". However, this expression has a broader meaning and encompasses, for instance, 2D images displayed with monoscopic depth cues, computer-generated (CG) 3D models or stacks of images arranged in the depth direction (e.g. tomographic images).

for viewing without the aid of special glasses, i.e. using autostereoscopic displays
Definition statement

This place covers:

Electronic signal processors and controllers specially adapted for driving and controlling of autostereoscopic displays, automultiscopic displays, integral imaging displays or privacy displays using a parallax generating optic which projects or displays different images to the left and right eyes, thus creating a sense of depth. The parallax generating optic may include:

  • parallax barriers;
  • lenticular lenses;
  • an array of controllable light sources or a moving aperture or light source;
  • a fly-eye lens;
  • dual and multilayer devices that are driven by algorithms such as computed tomography and non-negative matrix factorisation and non-negative tensor factorisation to implement compressive light field displays; such devices are also called Content-Adaptive Parallax Barriers;
  • a varifocal lens or mirror.

Constructional arrangements and manufacturing methods for autostereoscopic displays, automultiscopic displays, integral imaging displays or privacy displays, for example, details related to the colour pixel arrangement with respect to the parallax barrier, layout or shape of pixels or mechanical control of position of the lenticular lens.

Illustrative examples

media124.png

(Autostereoscopic displays showing dependence on the user position - blended zones where the left and right images are seen with both eyes, inverse image zone where the left image is seen by the right eye and the right image is seen by the left eye)

Relationships with other classification places

Volumetric displays and holographic displays are not autostereoscopic displays for the purpose of this group. Examples of relevant classification places for volumetric and holographic displays can be found under the informative references below.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Constructional details related to television receivers

H04N 5/64

Volumetric displays, i.e. systems where the image is built up from picture elements distributed over a volume

H04N 13/388

Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics

G02F 1/00

Structural association of optical devices, e.g. polarisers, reflectors or illuminating devices, with liquid crystal display cells

G02F 1/335

Stereoscopic photography by sequential viewing

G03B 35/16

Stereoscopic photography by simultaneous viewing

G03B 35/18

Stereoscopic photography by simultaneous viewing using aperture or refractive resolving means on screen or between screen and eye

G03B 35/24

Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them

G03H 1/00

Constructional details related to the housing of computer displays

G06F 1/16

Advertising or display means

G09F 19/00

Displaying different signs depending upon the view-point of the observer

G09F 19/14

Involving the use of mirrors

G09F 19/16

Control arrangements or circuits to produce spatial visual effects, for example rotating displays

G09G 3/00

Special rules of classification

This group should be assigned when no explicit reference to the particular type of the autostereoscopic display device is disclosed and when the autostereoscopic display device is not defined in the subgroups.

media125.png

(Autostereoscopic displays using an array forming a moving aperture (120) and lenticular lens (110) placed behind the display should be classified in H04N 13/305 in combination with H04N 13/32)

media126.png

(Autostereoscopic displays using an array of controllable light sources and lenticular lenses placed in front of and behind the display should be classified in H04N 13/305 in combination with H04N 13/32)

using lenticular lenses, e.g. arrangements of cylindrical lenses
Definition statement

This place covers:

Electronic signal processors and controllers specially adapted for driving and controlling of autostereoscopic displays using lenticular lenses, as well as constructional arrangements and manufacturing methods for such autostereoscopic display devices.

Autostereoscopic displays using lenticular lenses are not limited to the cases where the lenticular lenses are arranged in front of the display only.

Illustrative examples

media37.png

(Autostereoscopic displays using lenticular lenses placed in front of the display)

media38.png

(Electrical control of properties of the lenticular lens)

media39.png

(In the first mode, the light output directing function is provided by an array 35 that is closer to the display panel 33. This mode provides a limited amount of perceived depth, but high resolution, and is therefore suitable for use in a "monitor" application where high resolution is more important. In the second mode, the light output directing function is provided by an array 37 that is further from the display panel 33.)

Special rules of classification

The present group should be assigned in combination with other groups of H04N 13/00, for example:

when a lenticular lens is used in combination with a moving aperture or controllable light sources

H04N 13/32

when the lenticular lens is used in combination with a parallax barrier

H04N 13/31

when the lenticular lens is slanted

H04N 13/317

when the autostereoscopic display is for multi-view display

H04N 13/349

when the user is tracked

H04N 13/366

Synonyms and Keywords

In patent documents, the following words/expressions are often used as synonyms:

In patent documents, the expressions "lenticular" and "lens array" are often used as synonyms, although these terms may also be used for fly-eye lens arrays.

using fly-eye lenses, e.g. arrangements of circular lenses
Definition statement

This place covers:

Volumetric or integral imaging displays that use a fly-eye lens array.

Integral imaging systems consist of a two-dimensional (2D) lens array and a display system. An elemental image on a 2D panel gives a different perspective to each elemental lens, as shown in the figure below. The lens array integrates the elemental images to form a 3D image with full parallax (horizontal and vertical) and an almost continuous view.

media40.jpg

media127.png

(A display comprises a pixel array (104) and an optical element array (102) disposed in close proximity to the pixel array. The pixel array is operated to display two or more images. The optical element array is configured and operated to direct each image to an associated viewing position, enabling a viewer to separately view each image from the respective associated viewing position.)

media128.png

(A naked eye stereoscopic display includes a plurality of projectors (1), a microlens array (2) for collecting light beams of an image projected from the projectors, and a diffuser panel (120) for diffusing the light beams collected by the microlens array. Furthermore, the diffuser panel is arranged such that a virtual light collection point is formed among a plurality of light collection points of light beams by a plurality of microlenses constituting the microlens array.)

Special rules of classification

This group is the only group where integral imaging displays are classified.

Synonyms and Keywords

In patent documents, the following words/expressions are often used as synonyms:

  • "microlens array", "lens array" and "fly-eye lens array"
using parallax barriers
Definition statement

This place covers:

Autostereoscopic displays which use parallax barriers. A parallax barrier is a device placed in front of or behind an image source, such as a liquid crystal display, to allow it to show a stereoscopic image or multiscopic image without the need for the viewer to wear 3D glasses.

media43.png

(A display panel 10 and a parallax barrier 20)

Special rules of classification

This group should always be assigned when the parallax barrier is a device placed in front of the image source.

This group should be assigned in combination with other groups of H04N 13/30, for example:

when a parallax barrier is used in combination with a moving aperture or controllable light sources

H04N 13/32

when the user is tracked

H04N 13/366

the parallax barriers being placed behind the display panel, e.g. between backlight and spatial light modulator [SLM]
Definition statement

This place covers:

Autostereoscopic displays which use a parallax barrier behind an image source. If the parallax barrier is placed behind the LCD pixels, the light from a slit passes the left image pixel in the left direction, and vice versa. This produces the same basic effect as a front parallax barrier. In both cases the image displayed is column interlaced.

media44.png

(The parallax barrier slit is visible at the same horizontal position within each pixel (R pixel, L pixel) of one view (R, L)).

media45.png

(Figure 1 shows two images displayed on the display layer 4, with the two images displayed on alternate columns of pixels; one image is displayed on pixel columns Cl, C3, C5 and a second image is displayed on pixel columns C2, C4, C6. The image display device is illuminated by light 7 from a light source.)
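
The column-interlaced image arrangement mentioned above can be sketched as follows (an illustration only, assuming equally sized L and R views and a vertical barrier assigning alternate columns to alternate views):

```python
import numpy as np

def column_interlace(left, right):
    """Interleave L and R views column by column, as displayed with a vertical
    parallax barrier: even columns carry one view, odd columns the other."""
    out = left.copy()
    out[:, 1::2] = right[:, 1::2]   # odd columns carry the right view
    return out
```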

Special rules of classification

This group should always be assigned when the parallax barrier is placed behind the image source.

Optical masks which form part of a controllable light source should not be classified in the group, but in H04N 13/32.

using slanted parallax optics
Definition statement

This place covers:

Autostereoscopic displays where the parallax optic, for example a lenticular lens or parallax barrier is slanted with respect to the pixels matrix of the SLM.

Autostereoscopic displays where the pixels or the pixel matrix is slanted with respect to the parallax optics.

In 1996, van Berkel proposed that the lenticular sheet could be placed at a slant over a standard LCD screen. This approach removes the picket fence effect, creates a smooth transition between the views and at the same time balances the horizontal vs. vertical resolution of a view. Another solution with similar effects is called a "wavelength-selective filter array". Essentially, the filter is a slanted parallax barrier which covers the display and defines a particular light penetration direction for each sub-pixel.

Illustrative examples

media129.png

(A display panel (10) which includes a plurality of pixels (21) arranged in a plurality of coloured sub-pixels and displays an image frame; a viewing area separating unit (120) arranged as a filter in front of the display panel.)

media47.png

(The display system 100 comprises a pixel array 102 and lenses 106 disposed over the pixel array 102. In an embodiment, pixel array 102 may include pixels 104 that are slanted relative to the lenses 106.)

Special rules of classification

This group should be assigned in combination with other groups of H04N 13/30, for example:

Colour aspects of stereoscopic or multi-view image producers, e.g. for control or arrangement of colour sub-pixels

H04N 13/324

using arrays of controllable light sources; using moving apertures or moving light sources
Definition statement

This place covers:

Autostereoscopic displays using controllable light sources or arrangements, adjustment of which directs the light in different directions, so as to direct a displayed image (or portion thereof) toward a viewer's eye.

Autostereoscopic displays in which the direction of the displayed image is manipulated by movement of apertures, by movement of light sources or by using optical masks that form part of a controllable light source.

Illustrative example

media48.png

(The backlight module (1) comprises a first light guide plate (21) and a second light guide plate (22) which are stacked, and comprises a first light source (11) disposed opposite to the first light guide plate (21), and a second light source (12) disposed opposite to the second light guide plate (22).)

Relationships with other classification places

Illumination arrangements using parallax barriers are classified in this group and not in H04N 13/312.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Light guides

G02B 6/00

Special rules of classification

Backlight modules commonly comprise lenticular lenses or parallax barriers. In such cases, this group should be assigned in combination with:

using a lenticular lens

H04N 13/305

using a parallax barrier

H04N 13/31

In these examples the lenticular lenses or parallax barriers are part of the backlight modules:

media130.png

media131.png

(Fig. 2 and 4 show 3D display systems that use a lenticular lens 22 or a parallax barrier 26, along with a shutter plate 30, as a light directing device to allow a viewer's right eye to see a right image and the left eye to see a left image on a display panel. The right and left images are alternately displayed.

Although the parallax barrier 26 is placed behind the spatial light modulator [SLM] 10, H04N 13/312 shall not be allocated, since it is a part of the controllable illumination arrangement.)

using varifocal lenses or mirrors
Relationships with other classification places

Volumetric display systems where the image is built up from picture elements distributed over a volume are classified in H04N 13/388.

Calibration thereof
Definition statement

This place covers:

  • Colour or brightness adjustment with respect to the stereoscopic images when considering specific optical and constructional properties of a specific stereoscopic type display
  • Geometric correction of stereoscopic images with respect to errors arising from the relative positions between the different optical elements, such as the pixels and the parallax optic;
  • Mechanical or electrical change of properties or position of optical elements, such as the lenticular screen, to compensate for misalignments between the optical elements.

Calibration can be performed automatically or by the user when viewing a predetermined calibration or test image

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Improving the 3D impression of a stereoscopic image by modifying image signal contents

H04N 13/122

Equalising the characteristics of different image components in stereoscopic images, e.g. colour balance

H04N 13/133

Special rules of classification

This group should always be assigned in combination with a respective display type. For example, calibration of autostereoscopic displays should be classified in both H04N 13/302 and H04N 13/327.

using spectral multiplexing
Definition statement

This place covers:

  • Stereoscopic displays using an anaglyph display method, e.g. by displaying the image for each eye using filters of different (usually chromatically opposite) colours, typically red and cyan. When viewed through "anaglyph glasses", wherein each lens comprises a corresponding colour filter, an integrated stereoscopic image is perceived by the viewer.
  • Stereoscopic displays using a full colour anaglyph display method, in which different images represented by triplets of slightly different primary colours (e.g. R_L G_L B_L for the left eye and R_R G_R B_R for the right eye) are presented to the left and right eyes respectively and viewed through glasses with selective filters. This technique may also be referred to as 'wavelength multiplex visualization'.
  • Stereoscopic displays using Pulfrich display method obtained from a light/dark filter arrangement.

An example of spectral multiplexing comprises simultaneously displaying left and right images separated by using glasses with different spectral characteristics

media51.png

In the example, the dotted lines represent the wavelengths seen by the left eye and the continuous lines represent those seen by the right eye. The left eye sees RGB image components of slightly different wavelengths than those seen by the right eye. When provided with the correct set of filters, e.g. Fabry-Perot filters, which let through light within limited, chosen ranges of wavelengths, the viewer will perceive a full colour stereoscopic image.
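
For the simple red/cyan anaglyph case mentioned above, a minimal sketch (RGB channel order assumed; full-colour wavelength-multiplex systems instead use selective interference filters as described):

```python
import numpy as np

def red_cyan_anaglyph(left_rgb, right_rgb):
    """Simple anaglyph: red channel from the left view, green and blue channels
    from the right view; intended for viewing through red/cyan glasses."""
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]   # channel 0 assumed to be red
    return out
```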

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Stereoscopic photography by simultaneous viewing using polarised or coloured light for separating different viewpoint images

G03B 35/26

using polarisation multiplexing
Definition statement

This place covers:

  • Display systems which display stereoscopic images simultaneously or sequentially, each image presented by light of a different polarisation. Such systems conventionally require passive glasses having different polarising characteristics for each eye.
  • Display systems which display different images simultaneously or sequentially for different viewers wearing glasses having differing polarising characteristics. Viewers wearing differently polarised glasses see different displayed images (e.g. "privacy" displays).

media52.png

An example of using polarisation multiplexing comprises simultaneously displaying left and right images which are separated by using glasses with different polarising characteristics

media53.jpg

(Odd pixel lines (running horizontally) are rotated clockwise, and even pixel lines counter-clockwise, using circular polarisation. Viewing glasses make it possible for the right eye to see only the odd lines, and the left only the even lines, again using polarising films, producing the 3D image.)

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Stereoscopic photography by simultaneous viewing using polarised or coloured light for separating different viewpoint images

G03B 35/26

Special rules of classification

This group should not be assigned for shutter-type displays which use polarisers in the glasses as part of the shutter system for completely blocking the light; such displays should instead be classified in H04N 13/385.

However, this group should be assigned in combination with H04N 13/341 for cases when both eyes see an image, if polarisation alternating glasses are used.

using spatial multiplexing (H04N 13/337 takes precedence)
Definition statement

This place covers:

Formation of a stereoscopic image by simultaneously displaying left and right images on different parts of a display and using glasses to optically recombine the stereoscopic image, e.g. with prisms or mirrors

media54.png

References
Limiting references

This place does not cover:

Stereoscopic displays using polarisation multiplexing, for simultaneously displaying left and right images

H04N 13/337

using temporal multiplexing
Definition statement

This place covers:

Formation of a stereoscopic image by alternately displaying left and right images separated in time and by using glasses, e.g. with shutters, alternately to block the right and left eye.

Shutter type display systems using a frame sequential method of displaying 3D images. Full high-definition (HD) images are alternated between left and right eyes each frame, using glasses with synchronised liquid crystal shutters alternately to block left and right eye vision.

Frame sequential methods of displaying 3D images when the optical properties, such as colour filtering or polarisation characteristics, of each lens of the shutter glasses are alternated with each frame, i.e. active glasses.

Shutter type display systems using frame sequential method of displaying different pictures for different viewers wearing shutter glasses to select one of said pictures ("privacy"), wherein said pictures are 2D or 3D pictures.

media55.png

(Timing diagrams for shutter type stereoscopic displays showing synchronisation between the LCD panel, the LED backlight and the Glasses of Shutter type stereoscopic displays)
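
A minimal sketch of the frame-sequential ordering described above (purely illustrative; real systems synchronise the shutter glasses to the display refresh, e.g. via an infrared or radio link):

```python
def frame_sequential(left_frames, right_frames):
    """Temporal multiplexing: left and right frames alternate in time; the
    shutter glasses block the eye that the current frame is not intended for."""
    for left, right in zip(left_frames, right_frames):
        yield ("L", left)    # left shutter open, right shutter closed
        yield ("R", right)   # right shutter open, left shutter closed
```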

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Frame sequential stereoscopic displays using passive glasses

H04N 13/334, H04N 13/337

with head-mounted left-right displays
Definition statement

This place covers:

  • head-mounted displays for stereoscopic viewing
  • head-mounted displays specially adapted for augmented reality systems
  • head-mounted displays comprising viewer tracking for generating look around images

media56.png

(The head-mounted display 100 illustrated has display panels 104L and 104R for the left eye and the right eye at a side surface facing a face of a user.)

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Optical head-up displays

G02B 27/01

Manipulating 3D images for computer graphics, e.g. for virtual reality (VR) or augmented reality (AR) display

G06T 19/00

Special rules of classification

This group may be assigned in combination with several further groups if the head-mounted display is used, for example, in augmented or mixed reality systems or if the user position is tracked. For example:

Stereo video generation from a 3D object model, e.g. computer-generated stereoscopic image signals

H04N 13/275

Mixing stereoscopic image signals

H04N 13/156

using prisms or semi-transparent mirrors
Definition statement

This place covers:

  • 3D display arrangements which use a semi-transparent mirror or prism for optically mixing or separating left and right images.
Special rules of classification

This group should normally be assigned in combination with a respective display type, for example:

Volumetric display with depth sampling

H04N 13/395

stereoscopic displaying with polarisation multiplexing, for simultaneously displaying left and right images

H04N 13/337

Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking (for viewing without the aid of special glasses using fly-eye lenses H04N 13/307)
Definition statement

This place covers:

Multi-view displays which simultaneously or sequentially display multiple (three or more) viewpoints (perspectives or views) of the same scene in different directions (zones, lobes, cones) with respect to the optical axis of the display in order to generate a look-around effect (motion parallax) when the user moves around the display. The viewpoints (views) are displayed irrespective of whether the viewer is tracked or not.

Multi-view displays which simultaneously or sequentially display multiple (three or more) viewpoints of different scenes in different directions (zones, lobes, cones), for example for privacy displays. The viewpoints (views) are displayed irrespective of whether the viewer is tracked or not.

Multi-view displays that display three or more viewpoints (perspectives or views of one or more scene) in different directions (zones, lobes, cones) with respect to the optical axis of the display.

The definition "without viewer tracking" does not mean that such display systems do not include viewer tracking. Some displays can include viewer tracking, e.g. for preventing image flipping, but not for the creation of the multi-view effect as such.

References
Limiting references

This place does not cover:

Autostereoscopic displays using fly-eye lenses

H04N 13/307

Informative references

Attention is drawn to the following places, which may be of interest for search:

Volumetric displays

H04N 13/388

Special rules of classification

The generation of multiple viewpoints (look around or motion parallax effect) of a scene according to the viewer position is classified in H04N 13/117 for the image signal processing aspects or in H04N 13/279 for the image signal generation aspects. The displaying of such viewpoints on a display to simulate a look-around effect does not mean that the display is multi-view. Therefore, such stereoscopic systems should not be classified in H04N 13/349 if they do not comprise a multi-view display as defined above.

This group should normally be assigned in combination with a respective display type, which is normally of autostereoscopic type.

media57.png

The term "multi-view" is also used for privacy display devices which display different video content to different viewers. Such displays however are not necessarily multi-view displays if they cannot generate multiple (three or more) viewpoints and cones irrespective of whether the viewer is tracked or not. For example, one type of privacy displays (similar to an autostereoscopic display) displays (only) two different views in two different directions. Such a privacy display does not fall into the above definition for Multi-view displays and should be not classified in H04N 13/349.

for displaying sequentially
Definition statement

This place covers:

Sequential display of different images for different viewpoints at different time intervals. By controlling the display with a sufficiently high frame rate, viewers at different viewpoints will see different content, depending upon their position.

media58.png

(A first lens structure LS1 emits two viewpoint images displayed on the display panel 200 to viewpoint positions VW1, VW2, during the first interval of the frame. Then, the second lens structure LS2 emits two viewpoint images displayed on the display panel 200 to viewpoint positions VW3, VW4, during the second interval of the frame.)

having separate monoscopic and stereoscopic modes
Definition statement

This place covers:

Stereoscopic displays that are selectively switchable between a monoscopic (2D) mode and a stereoscopic (3D) mode.

The change in mode may be effected by electrically or mechanically modifying the properties of the display device or by changing the image content, for example:

  • by switching off the parallax optic
  • by removing the parallax optic
  • by controlling the shutter glasses
  • by displaying the same image in a stereoscopic display mode
Special rules of classification

This group should always be assigned in combination with groups representing the respective display type.

Switching between monoscopic and stereoscopic modes
Definition statement

This place covers:

Details relating to the switching of the display between monoscopic and stereoscopic modes, e.g.:

  • Synchronisation between the displayed image and the time of switching off the parallax barrier or the shutter glasses
  • Control of backlight level or brightness in 2D mode and in 3D mode
  • The display of warning messages before switching to 2D mode
  • Switching to 2D mode upon detection of a specific event, like detection of user fatigue, or that a user's position is not suitable for stereoscopic viewing
Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
Definition statement

This place covers:

Displays capable of simultaneously displaying both monoscopic and stereoscopic video content.

Image display control, e.g. determining the position at which a parallax barrier should be activated to display a stereoscopic image upon a monoscopic image background.

media59.png

(Among the plurality of pixel regions included in the display panel 10, in the pixel region in which the user views an image through a region 20a in the selectively light-blocking panel 20, a 3D image (L) for the left eye and a 3D image (R) for the right eye are displayed, whilst 2D images are displayed in the pixel regions other than the region.)

Relationships with other classification places

Generating mixed monoscopic and stereoscopic images when the mixing is performed irrespective of the display type is not covered by this group.

Examples of classification places which may be relevant for search, e.g. mixing a stereoscopic GUI or subtitles with a stereoscopic or monoscopic image, can be found in the informative references below.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Mixing stereoscopic image signals

H04N 13/156

Subtitles or other on-screen display [OSD] information, e.g. menus

H04N 13/183

Special rules of classification

This group should always be assigned in combination with a respective display type, for example:

for autostereoscopic displays

H04N 13/302

using image projection screens (volumetric display H04N 13/388)
Definition statement

This place covers:

Stereoscopic display systems using projection devices

References
Limiting references

This place does not cover:

Volumetric displays

H04N 13/388

Informative references

Attention is drawn to the following places, which may be of interest for search:

Projection displays

H04N 5/74, H04N 9/31

Stereoscopic photography by simultaneous viewing using two or more projectors

G03B 35/20

Stereoscopic photography by simultaneous viewing using single projector with stereoscopic-base-defining system

G03B 35/22

Special rules of classification

This group should always be assigned in combination with a respective display type, for example:

projection devices for autostereoscopic displays

H04N 13/302

using digital micromirror devices [DMD]
Definition statement

This place covers:

  • Stereoscopic or multi-view display systems using micromechanical devices, e.g. MEMS mirror devices or DMD based spatial light modulators (SLMs)
using viewer tracking
Definition statement

This place covers:

Different aspects of viewer tracking for control of stereoscopic systems, for example:

  • adjusting the viewing zones of an autostereoscopic display
  • adjusting the depth according to a user's position or orientation
  • generating different perspectives
  • controlling the image capturing process, e.g. adjusting the camera separation between real (or virtual) cameras used for image generation
  • determining user fatigue
  • performing geometrical corrections, e.g. vertical parallax
  • switching the display between 2D and 3D mode
  • rotating the display toward the viewer
  • generating motion parallax
  • adjusting depth parameters or performing crosstalk cancellation

This group also covers display systems which detect the presence of a viewer in front of the display, e.g. by detecting that the shutter glasses are switched on.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Input arrangements or combined input and output arrangements for interaction between user and computer, for example viewer tracking for gesture recognition

G06F 3/01

Special rules of classification

This group should be assigned in combination with a group for the respective display type, when specific properties of the display device are controlled, e.g. when the parallax optic is moved as a function of the user position:

media60.png

(Control of the parallax barrier as a function of the user position)

This group should be assigned in combination with groups under H04N 13/20 for respective stereoscopic picture signal generators when specific properties of the picture signal generator (e.g. camera base line distance, convergence point, zoom or orientation) are controlled as a function of the viewer position with respect to the display screen.

This group should be assigned in combination with groups under H04N 13/10 for respective stereoscopic image processing, when specific image signal properties are controlled as a function of the viewer position.

for two or more viewers
Definition statement

This place covers:

Generation of respective viewing zones for multi-viewer autostereoscopic displays.

Providing different perspectives to different viewers depending on their positions.

Providing different images to different viewers upon detection of multiple users (e.g. for privacy purposes).

Adjusting viewing zones of multi-view image displays when viewed by several viewers.

media61.png

(A multi-user autostereoscopic display)

media62.png

(Stereoscopic display providing different perspectives to different users depending on their positions)

media63.png

(A multi-view image display when viewed by several viewers, with backlight 10, light emitting area control unit 300 with barrier part BP and transparent slit part TP, directional control unit 400 with a lenticular lens, display unit 100 with a plurality of color pixels, liquid crystal barrier panel as viewpoint generating unit 200 and image plane IP which includes an illumination area LA and a non-illumination area NLA.)

for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Input arrangements or combined input and output arrangements for interaction between user and computer, e.g. eye tracking input arrangements

G06F 3/01

alternating rapidly the location of the left-right image components on the display screens (for viewing without the aid of special glasses using time variant parallax barriers H04N 13/315; displays for viewing with the aid of special glasses or head-mounted displays using temporal multiplexing H04N 13/341)
Definition statement

This place covers:

  • Temporally multiplexed displays
Relationships with other classification places

Polarisation multiplexing displays, using time alternating display of left and right images and passive polarising glasses, are classified in H04N 13/337.

Autostereoscopic displays, using an array of controllable light sources or a moving aperture or light source when the left and right images are alternately displayed in time, are classified in H04N 13/32.

References
Limiting references

This place does not cover:

Autostereoscopic displays using time-variant parallax barriers

H04N 13/315

Stereoscopic displays for viewing with the aid of special glasses or head-mounted displays [HMD], using temporal multiplexing

H04N 13/341

Special rules of classification

This group should always be assigned in combination with a respective display type.

Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
Definition statement

This place covers:

Display devices forming a visual representation of an object in 3D

The volumetric display creates 3D images by the selective emission, scattering, or relaying of illumination from defined points within the 3D viewing volume.

Relationships with other classification places

Neither holographic nor multi-view displays should be classified in this group.

Most volumetric 3D displays create 3D imagery visible to the unaided eye. However, other displays that do not rely upon additional viewing aids (e.g. glasses) should be classified in their relevant groups. For example, autostereoscopic displays are classified in H04N 13/302, whilst multi-view displays are classified in H04N 13/349.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Autostereoscopic displays

H04N 13/302

Multi-view displays

H04N 13/349

Holographic processes or apparatus using light for obtaining images from holograms

G03H 1/00

the volume being generated by a moving, e.g. vibrating or rotating, surface
Definition statement

This place covers:

Volumetric displays wherein image content is displayed upon, and synchronised with, the position of a moving surface such that the viewer perceives a 3D volume. Examples include swept-volume displays in which a 3D object is decomposed into 2D slices which are sequentially displayed or projected upon a rotating planar surface. If the rate of sequential display and corresponding surface rotation are sufficiently high, the human eye perceives a displayed 3D volume, due to persistence of vision.

media64.png

(The volumetric 3D display includes a transparent enclosure 252, a projection screen 254, rasterization electronics 256, a projection engine 258, and relay optics 260.)

with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
Definition statement

This place covers:

Volumetric displays in which the 3D image volume is decomposed into a series ('stack') of constituent 2D image planes, each of which is displayed individually, for example by separate display units or by changing the depth of focus of the image projection optics. When the 2D image planes are viewed together, or in rapid succession, the viewer perceives a 3D volume.

media65.png

(The 3D display apparatus comprises an array of multiple layers of display units comprising at least two layers of display units 10)

Synchronisation thereof; Control thereof
Definition statement

This place covers:

Synchronisation between left and right images output to a display.

Synchronisation between a temporally varying parallax optic and the corresponding image signal provided to the display.

Synchronisation between shutter glasses and the image display period of a shutter display.

Controlling the position of a parallax optic in order to change the depth resolution.

Controlling a display to switch between different modes of operation.

Controlling shutter glasses to switch off when not in use.

Controlling the display timing, backlight or shutter glasses in order to reduce crosstalk.

Controlling the synchronisation protocols between shutter glasses and shutter type display.

Controlling the number of generated views depending upon user selection or upon the number of detected viewers.

User interfaces for controlling stereoscopic display properties.

Special rules of classification

This group should normally be assigned in combination with a respective display type, for example:

Synchronisation or control aspects for autostereoscopic displays

H04N 13/302

Control arrangements or circuits to produce spatial visual effects, for example rotating displays

G09G 3/00

When classifying in this group, classification in G09G 3/00 should also be considered, particularly if aspects of synchronisation or control are present which relate to the type of display panel (e.g. whether it is an LCD, an OLED, etc.).

Diagnosis, testing or measuring for television systems or their details
Definition statement

This place covers:

Hardware-related or software-related aspects specific to the measuring or testing of values involved in television signal processing at the transmitter side and/or the receiver side, for analog or digital television signals.

Relationships with other classification places

H04N 17/00 covers test techniques for all the devices belonging to the television chain: television cameras, transmission paths, television receivers or recorders, and distribution systems, which are found in H04N 5/00, H04N 7/00, H04N 9/00, H04N 11/00 and H04N 21/00.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangements for locating faults; Arrangements for electrical testing characterised by what is being tested not provided for elsewhere

G01R 31/00

Testing correct operation of photographic apparatus or parts thereof

G03B 43/00

Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays

G09G 3/006

Monitoring of transmission systems; Testing of transmission systems

H04B 17/00

Special rules of classification

H04N 17/00 features a limited number of CPC symbols and has an associated Indexing Code scheme with additional subdivisions under H04N 17/00.

Allocation of CPC symbols and/or Indexing Code symbols:

  • a document containing invention information relating to testing of television systems or details thereof will be given an H04N 17/00 CPC symbol as invention information
  • a document containing additional information relating to testing of television systems or details thereof will be given an H04N 17/00 Indexing Code symbol as additional information
  • a document merely mentioning details of testing of television systems will not be given a CPC symbol as invention information, but it may receive an Indexing Code if the disclosure is considered relevant.

Monitoring aspects are also covered in the appropriate main groups, e.g. H04N 5/00, H04N 7/00, H04N 21/00.

Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
Definition statement

This place covers:

  • Methods or arrangements for coding or compressing an input digital video sequence for the purpose of onward transmission (e.g. by broadcasting), or of storage (e.g. at servers, set-top boxes or hard-disks) for subsequent reproduction in viewers' premises.
  • Processing in accordance with standards such as MPEG-x or H.26x.
  • Methods or arrangements for transform coding of static images.
  • The scope of H04N 19/00 and its subgroups is limited to the part of digital video coding and compression strictly between the digital video input and the compressed video output.
Relationships with other classification places
  • Processing of the compressed video (e.g. fragmentation into packet units, encapsulation, medium adaptation for transport, video distribution) is covered by H04N 21/00 or H04H.
  • Processing of video signals that are not yet compressed, or of decoded video signals, such as re-sampling, interpolation, cropping or rotation, is generally covered by G06T, unless it interacts with aspects of processing for compression, in which case it is covered by the relevant subgroups of H04N 19/00.
  • Computer graphics compression is covered by G06T 9/00.
  • General compression algorithms are covered by H03M 7/30.
  • Processing of documents or images for scanning, transmission or reproduction (e.g. telefax) is covered by H04N 1/00.
  • Details of digital television cameras, digital television receivers and digital video recorders are covered by H04N 5/00.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Processing of documents or images for scanning, transmission or reproduction (e.g. telefax)

H04N 1/00

Bandwidth or redundancy reduction for scanning, transmission or reproduction of documents or the like, e.g. compression of two-tone or discrete tone static images

H04N 1/41

Colour conversion

H04N 1/60

Studio equipment, e.g. video cameras or devices for controlling television cameras

H04N 5/222

Television receivers

H04N 5/44

Video recording and play (e.g. trick play)

H04N 5/76

Closed circuit TV systems, details of video-surveillance cameras and circuits

H04N 7/18

Stereoscopic or multiview television systems

H04N 13/00

Diagnosis, testing or measuring for television systems

H04N 17/00

Selective content distribution

H04N 21/00

Information retrieval and database structures therefor, e.g. in image databases

G06F 16/00

Pattern recognition

G06F 18/00

General purpose image data processing, e.g. hardware for image processing

G06T 1/00

Geometric image transformation in the plane of the image

G06T 3/00

Image restoration

G06T 5/00

Image analysis, e.g. analysis of motion

G06T 7/00

Image coding

G06T 9/00

2D image generation

G06T 11/00

2D image animation (e.g. sprites in general)

G06T 13/80

3D image rendering

G06T 15/00

3D image modelling

G06T 17/00

Arrangements for image or video recognition or understanding

G06V 10/00

Scenes; Scene-specific elements

G06V 20/00

Character recognition, recognising digital ink or document-oriented image-based pattern recognition

G06V 30/00

Recognition of biometric, human-related or animal-related patterns in image or video data

G06V 40/00

Speech or audio signal analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis

G10L 19/00

Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel, e.g. signal processing for video editing and recording on a special recording medium

G11B 27/00

General data coding

H03M

Details of multimedia broadcast systems

H04H

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

JPEG

Joint Photographic Experts Group

AVC

Advanced Video Coding

SVC

Scalable Video Coding

HEVC

High Efficiency Video Coding

using adaptive coding
Definition statement

This place covers:

Static or dynamic adaptation of the interaction between the different building blocks or processes of the digital video compressor or decompressor, e.g. regulation of the parameters involved in the compression algorithm as a function of the channel capacity or of the desired quality of the reconstructed video signal.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Controlling the complexity of the video stream at the transmitter side, e.g. by scaling the resolution or bitrate of the video stream

H04N 21/2662

Content or additional data management, e.g. controlling the complexity of the video stream at the receiver side

H04N 21/462

Special rules of classification

When classifying in this group, each aspect relating to adaptive coding should, insofar as possible, be classified in each one of the subgroups H04N 19/102, H04N 19/134, H04N 19/169 and H04N 19/189.

media1.png

characterised by the element, parameter or selection affected or controlled by the adaptive coding
Definition statement

This place covers:

The definition of the element, parameter or selection, which is affected by the adaptive coding, wherein element is to be understood as a functional block or process in the digital video compressor or decompressor.

Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
Definition statement

This place covers:

The selection of the reference unit (as contained e.g. in the memories in the figure below) for prediction within a chosen coding or prediction mode, e.g.:

  • weighted prediction
  • adaptive choice of position and number of pixels used for prediction
  • choice between different motion estimators or compensators (e.g. between diamond search and full search, or between global and local motion compensation), skip mode, merge mode
  • adaptive choice of the reference frame or block in predictive encoding, e.g. spatial, temporal, inter-layer or inter-view compensation.
  • adaptive reference picture list management
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Non-adaptive reference picture list management

H04N 19/50

Multiple frame prediction

H04N 19/573

Bidirectional image interpolation, B-frames

H04N 19/577

Long-term prediction

H04N 19/58

between spatial and temporal predictive coding, e.g. picture refresh
Definition statement

This place covers:

The selection between spatial and temporal predictive coding, e.g. picture refresh by insertion of an intra-coded frame (for example periodically or at a scene change), or the decision between intra mode and inter mode as in the figure below.

media3.png
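
By way of illustration only, the spatial/temporal decision described above can be sketched as a comparison of prediction errors together with a periodic or scene-change-driven intra refresh. The helper names, the SAD criterion and the thresholds are assumptions, not taken from any standard.

    import numpy as np

    def frame_type(frame_index, scene_change, refresh_period=30):
        """Force an intra-coded frame periodically or at a scene change."""
        return "I" if scene_change or frame_index % refresh_period == 0 else "P"

    def choose_block_mode(block, intra_pred, inter_pred, intra_bias=0):
        """Pick spatial (intra) or temporal (inter) prediction for one block by
        comparing the sum of absolute differences of the two predictions."""
        sad_intra = int(np.abs(block.astype(int) - intra_pred.astype(int)).sum())
        sad_inter = int(np.abs(block.astype(int) - inter_pred.astype(int)).sum())
        return "intra" if sad_intra + intra_bias < sad_inter else "inter"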

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Intra-frame, I-frame

Frame coded with spatial prediction

Inter-frame, P-frame

Frame coded with temporal prediction in one temporal direction

Bidirectional-frame, B-frame

Frame coded with temporal prediction in both temporal directions

Anchor frame

A frame usable for prediction of other frames, i.e. an intra-frame or an inter-frame

among a plurality of temporal predictive coding modes
Definition statement

This place covers:

The selection among a plurality of temporal predictive coding modes, e.g. a plurality of inter-prediction modes as in the standard H.263 or H.264.

among a plurality of spatial predictive coding modes
Definition statement

This place covers:

The selection among a plurality of spatial predictive coding modes, e.g. a plurality of intra-prediction modes as the directional block intra-prediction modes in the standard H.264 shown below.

media4.png

according to a given display mode, e.g. for interlaced or progressive display mode
Definition statement

This place covers:

The selection of a given display mode, e.g. interlaced or progressive as in the figure (as in MBAFF of H.264), and of the associated coding or prediction mode.

media5.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Conversion of standards in television systems, e.g. at the pixel level of a picture from interlaced to progressive display mode and vice versa

H04N 7/01

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

MBAFF

Macroblock-adaptive frame-field coding

Adapting the group of pictures [GOP] structure, e.g. number of B-frames between two anchor frames (H04N 19/107 takes precedence)
Definition statement

This place covers:

  • The adaptation of the length or the composition of a GOP, e.g. by changing the number of B-frames between anchor frames or by changing the number of P-frames between I-frames.
  • The selection of the structure of a group-of-pictures [GOP], e.g. of the number of P-frames or B-frames between two anchor frames, as in the figure below.

media6.png
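
As an informal illustration of GOP adaptation, the sketch below builds an MPEG-style frame-type pattern from a GOP length n and an anchor spacing m (i.e. m - 1 B-frames between anchors), and adapts m to a measured motion level. All names, thresholds and default values are hypothetical.

    def gop_pattern(n=12, m=3):
        """Display-order frame types of one closed GOP,
        e.g. n=12, m=3 -> I B B P B B P B B P B B."""
        types = []
        for i in range(n):
            if i == 0:
                types.append("I")    # intra anchor starting the GOP
            elif i % m == 0:
                types.append("P")    # forward-predicted anchor
            else:
                types.append("B")    # bidirectionally predicted frame
        return types

    def adapt_anchor_spacing(motion_level, low=0.2, high=0.6):
        """More B-frames for near-static content, fewer for fast motion."""
        return 3 if motion_level < low else (2 if motion_level < high else 1)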

References
Limiting references

This place does not cover:

The selection between spatial and temporal predictive coding

H04N 19/107

Informative references

Attention is drawn to the following places, which may be of interest for search:

Bidirectional image interpolation, B-frames

H04N 19/577

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Group-of-pictures

A group of successive pictures forming a logical unit within a coded video sequence in H.26x and MPEG standards.

Open GOP

A GOP which uses reference pictures from the previous GOP at the current GOP boundary.

Closed GOP

A GOP that uses no reference pictures from the previous GOP at the current GOP boundary (e.g. the classic GOP starting with an I-frame).

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

GOF

Group of frames.

GOP

Group of pictures.

Selection of the code volume for a coding unit prior to coding
Definition statement

This place covers:

The selection of the target rate or code volume assigned to a coding unit (e.g. a picture or a group of pictures) before the unit itself is coded, as done within the rate controller in the figure below, or the selection of the frame rate.

media7.png
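
A minimal sketch of selecting the code volume before coding, assuming a constant-bit-rate budget spread over a GOP with fixed relative I/P/B frame costs; the weights and example figures are illustrative assumptions, not values from any standard.

    def allocate_bits(bitrate_bps, frame_rate, gop_types, weights=None):
        """Assign a target code volume (in bits) to each frame of a GOP before
        the frames are coded."""
        weights = weights or {"I": 4.0, "P": 2.0, "B": 1.0}   # assumed relative costs
        gop_budget = bitrate_bps * len(gop_types) / frame_rate
        total_weight = sum(weights[t] for t in gop_types)
        return [gop_budget * weights[t] / total_weight for t in gop_types]

    # Example: 2 Mbit/s at 25 frames/s over the GOP I B B P B B P B B P B B.
    targets = allocate_bits(2_000_000, 25.0, list("IBBPBBPBBPBB"))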

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Data rate or code amount at the encoder output

H04N 19/146

Filters, e.g. for pre-processing or post-processing (sub-band filter banks H04N 19/635)
Definition statement

This place covers:

Subject matter wherein the filtering is required to be part of an adaptive coding process, e.g. quantisation controlling the filtering process, an adaptive switching function after the filtering process, optional filtering characteristics, or the adaptive selection of a filter type or of filter parameters such as strength and taps, as within the filter indicated in the figure below, as a function of a threshold determination.

media8.png

References
Limiting references

This place does not cover:

Sub-band based transform characterised by filter definition or implementation details

H04N 19/635

Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of filtering operations specially adapted for video compression and not necessarily of adaptive nature

H04N 19/80

Pre-processing or post-processing specially adapted for video compression

H04N 19/85

Image enhancement or restoration by use of local operators

G06T 5/20

Impedance networks; Resonators

H03H

Adaptive subdivision aspects, e.g. subdivision of a picture into rectangular or non-rectangular coding blocks
Definition statement

This place covers:

  • Adaptive segmentation aspects during video compression, e.g. ROI segmentation.
  • The selection of the subdivision of a picture into coding blocks, i.e. the determination of the grid of blocks covering a picture.
  • The selection may involve the shape, e.g. rectangular or non-rectangular, or the size of the blocks, e.g. in the standard H.264 with selection among 4 x 4, 4 x 8, 8 x 4, 8 x 8 pixel block sizes as shown in the figures below.

media9.png

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Macroblock

An MPEG coding unit including 16 x 16 pixels subdivided into four 8 x 8 blocks.

Synonyms and Keywords

In patent documents, the following words/expressions are often used as synonyms: "block", "sub-block" and "tile".

In patent documents the word "tile" is often used in the context of the standard JPEG 2000 and of transform coding of static images.

Selection from among a plurality of transforms or standards, e.g. selection between discrete cosine transform [DCT] and sub-band transform or selection between H.263 and H.264
Definition statement

This place covers:

Selection from a plurality of alternative compression algorithms within a video compressor, e.g.

  • Selection between discrete cosine transforms [DCT] and sub-band transforms.
  • Selection from a plurality of video compression standards, e.g. selection between H.263 and H.264, or between MPEG-2 and MPEG-4.
  • Selection between lossy and lossless compression.
  • Transform skip mode (cf. HEVC).

media10.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Video compression based on transform coding

H04N 19/60

Special coding techniques and algorithms

H04N 19/90

Selection of transform size, e.g. 8x8 or 2x4x8 DCT; Selection of sub-band transforms of varying structure or type
Definition statement

This place covers:

The selection of transform size within the same predetermined transform algorithm, e.g. 4x4 or 8x8 DCT as in the figure below, or 8x8 or 2x4x8 DCT for frame-based and for field-based block compression, respectively, or sub-band transforms of varying hierarchical structure or type.

media11.png

Quantisation
Definition statement

This place covers:

Subject matter wherein specific details of a controlled quantiser are provided, e.g. the frame type or input video characteristics controlling the quantiser, adaptive quantisation based on output or transmission buffer fullness, or the choice between fine and coarse quantisation.

media12.png
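
The kind of quantiser control described above can be illustrated, very roughly, by a sketch that derives a quantisation parameter from the transmission buffer fullness and the frame type; the range and offsets are arbitrary assumptions.

    def select_qp(buffer_fullness, frame_type, qp_min=10, qp_max=51):
        """Choose a coarser quantiser (higher QP) as the output buffer fills,
        with a small per-frame-type offset (finer for I-frames)."""
        fullness = max(0.0, min(1.0, buffer_fullness))
        base = qp_min + (qp_max - qp_min) * fullness
        offset = {"I": -2, "P": 0, "B": 2}.get(frame_type, 0)
        return int(round(max(qp_min, min(qp_max, base + offset))))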

Details of normalisation or weighting functions, e.g. normalisation matrices or variable uniform quantisers
Definition statement

This place covers:

Special algorithms used for quantisation in video compression, e.g. the choice of normalisation parameters or matrices, details of variable uniform quantisers or the calculation of quantisation weighting matrices.

Prioritisation of hardware or computational resources
Definition statement

This place covers:

The control of resource allocation or assignment (e.g. CPU time, memory, allocation of digital processing units, workload distribution among processors), e.g. skipping of encoding or decoding steps or switching off computing or hardware units, such as motion estimation/compensation or transform units.

media13.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Filtering control

H04N 19/117

Sampling, masking or truncation of coding units

H04N 19/132

Availability of hardware or computational resources, e.g. adapting coding based on assigned resources

H04N 19/156

Implementation details or hardware specially adapted for video compression or decompression

H04N 19/42

Scanning of coding units, e.g. zig-zag scan of transform coefficients or flexible macroblock ordering [FMO]
Definition statement

This place covers:

The adaptation of the scanning of coding units, e.g. the choice of a zig-zag scan of transform coefficients in a transform compressor, as in the figure, or the use of flexible macroblock ordering [FMO].

media14.png
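
For illustration only, the zig-zag scan of transform coefficients can be generated as below, walking the anti-diagonals of the block with alternating direction so that coefficients are visited from low to high frequency; the helper name is hypothetical.

    def zigzag_order(n=8):
        """Return the (row, col) positions of an n x n transform block in
        zig-zag scan order."""
        order = []
        for s in range(2 * n - 1):                       # s = row + col
            diag = [(r, s - r) for r in range(n) if 0 <= s - r < n]
            order.extend(diag if s % 2 else reversed(diag))
        return order

    # zigzag_order(8)[:6] -> [(0, 0), (0, 1), (1, 0), (2, 0), (1, 1), (0, 2)]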

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Definition of the coding unit

H04N 19/169

Video coding involving rearrangement of data among different coding units

H04N 19/88

Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]
Definition statement

This place covers:

Subject matter wherein the entropy coding is adapted, e.g. frame type determining the coding table, CABAC, CAVLC, adaptive Huffman coding, choosing among different VLC methods for coding as in the figure.

media15.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Non-adaptive entropy coding for video compression

H04N 19/91

Non-adaptive run-length coding for video compression

H04N 19/93

Conversion to or from variable length codes in general

H03M 7/40

Conversion to or from run-length codes in general

H03M 7/46

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

VLC

Variable Length Coding

CABAC

Context-Adaptive Binary Arithmetic Coding

CAVLC

Context-Adaptive Variable Length Coding

Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
Definition statement

This place covers:

Adaptive sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation, high-frequency transform coefficient masking (i.e. suppression or setting to zero) or macroblock skipping, as in the figure.

media16.png
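
A minimal sketch, under assumed thresholds, of two of the adaptations listed above: masking the high-frequency transform coefficients of a zig-zag-ordered block, and a crude frame-skipping decision.

    def mask_high_frequencies(zigzag_coeffs, keep=16):
        """Set to zero every transform coefficient beyond the first 'keep'
        positions of the zig-zag-ordered block."""
        return [c if i < keep else 0 for i, c in enumerate(zigzag_coeffs)]

    def skip_frame(motion_level, buffer_fullness, fullness_limit=0.9):
        """Drop the frame when the output buffer is nearly full and the scene
        is almost static (illustrative thresholds)."""
        return buffer_fullness > fullness_limit and motion_level < 0.05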

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Adaptive prioritisation of hardware or computational resources

H04N 19/127

Definition of the coding unit

H04N 19/169

Temporal sampling or interpolation for video coding

H04N 19/587

Spatial sampling or interpolation for video coding

H04N 19/59

characterised by the element, parameter or criterion affecting or controlling the adaptive coding
Definition statement

This place covers:

The definition of the element, parameter or criterion which controls an adapted element or selection (as classified in H04N 19/102) in the adaptive coding, wherein element is to be understood as a functional block or process in the digital video compressor or decompressor.

Motion inside a coding unit, e.g. average field, frame or block difference
Definition statement

This place covers:

Determination of the motion inside a coding unit, e.g. the amount of temporal prediction error, such as the average difference calculated on a field, a frame or a block at two different time instants.

media17.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Motion estimation or compensation for video compression

H04N 19/51

Analysis of motion in general

G06T 7/20

Analysis of motion vectors, e.g. their magnitude, direction, variance or reliability
Definition statement

This place covers:

The measure of motion performed by explicitly using motion vectors (e.g. magnitude, direction, variance or reliability measures).

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Motion estimation or compensation for video compression

H04N 19/51

Analysis of motion in general

G06T 7/20

Coding unit complexity, e.g. amount of activity or edge presence estimation (H04N 19/146 takes precedence)
Definition statement

This place covers:

Determination of coding unit complexity, e.g. by activity detection as in the figure below (for instance flatness detection or the energy of transform coefficients), by detection of edge presence, or by determination of the amount of spatial prediction error.

media18.png
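
An informal sketch of the activity measure mentioned above, estimated as the pixel variance of a coding unit (flat blocks give low activity, textured or edge-containing blocks give high activity); the flatness threshold is an assumption.

    import numpy as np

    def block_activity(block):
        """Spatial complexity of a coding unit estimated as pixel variance."""
        return float(np.asarray(block, dtype=float).var())

    def is_flat(block, threshold=25.0):
        """Classify a block as flat when its activity falls below an assumed
        threshold; flat areas are usually quantised more finely because coarse
        quantisation is more visible there."""
        return block_activity(block) < threshold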

References
Limiting references

This place does not cover:

Measure of complexity defined by data rate or code amount at the encoder output

H04N 19/146

Detection of scene cut or scene change
Definition statement

This place covers:

The adaptive control of the video compression in response to detected scene cut or change.

media19.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Picture signal circuitry for video frequency region, e.g. scene change detection in television systems

H04N 5/14

Methods involving scene cut or scene change detection in combination with video compression

H04N 19/87

Data rate or code amount at the encoder output
Definition statement

This place covers:

The adaptive control of video compression by using information about the data rate or code amount at the encoder output.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Adaptation of the selection of the code volume for a coding unit prior to coding

H04N 19/115

according to rate distortion criteria (rate-distortion as a criterion for motion estimation H04N 19/567)
Definition statement

This place covers:

The adaptation of encoding as a function of data rate or code amount determined according to rate-distortion criteria, e.g. as a function of a cost function.

References
Limiting references

This place does not cover:

Rate distortion as a criterion for motion estimation

H04N 19/567

Informative references

Attention is drawn to the following places, which may be of interest for search:

Adaptation based on measured or subjectively estimated visual quality after decoding

H04N 19/154

Adaptation using optimisation based on Lagrange multipliers

H04N 19/19

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Cost function

A function of target parameters, such as the output rate and a quality measurement after decoding (e.g. distortion).

by estimating the code amount by means of a model, e.g. mathematical model or statistical model
Definition statement

This place covers:

The estimation of the code amount by means of a model, e.g. a mathematical model or a statistical model, as done in the MPEG-2 Test Model 5 (TM5).
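
A minimal sketch of such a model-based estimate, assuming a quadratic rate model R(Q) = a·X/Q + b·X/Q² in the spirit of, but not identical to, TM5-style rate control; the parameters a and b and the complexity measure X are assumptions.

    import math

    def estimate_bits(complexity, qstep, a=1.0, b=0.0):
        """Predict the code amount of a coding unit from its complexity X and
        the quantisation step Q with the assumed model a*X/Q + b*X/Q**2."""
        return a * complexity / qstep + b * complexity / qstep ** 2

    def qstep_for_target(complexity, target_bits, a=1.0, b=0.0):
        """Invert the model to pick Q for a given bit target."""
        if b == 0.0:
            return a * complexity / target_bits
        # solve b*X*u**2 + a*X*u - target_bits = 0 for u = 1/Q
        u = (-a * complexity + math.sqrt((a * complexity) ** 2
             + 4.0 * b * complexity * target_bits)) / (2.0 * b * complexity)
        return 1.0 / u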

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding

H04N 19/189

by monitoring actual compressed data size at the memory before deciding storage at the transmission buffer
Definition statement

This place covers:

The estimation of the code amount by off-line encoding, i.e. encoding without storing at the transmission buffer, e.g. by means of a separate encoder as in the figure below, and counting the actual data size of the compressed elementary stream.

media20.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Data rate or code amount at the encoder output by estimating the code amount by means of a model

H04N 19/149

by measuring the fullness of the transmission buffer
Definition statement

This place covers:

The control of the video coding by using the measurement of the fullness of the transmission buffer, where the buffer may be implicit, e.g. in the case of a storage medium, a memory or a physical channel having a certain bit capacity.

media21.png
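
The fullness criterion can be pictured as a leaky bucket: the bits of each coded frame flow into the buffer, the channel drains it at a constant rate, and the resulting fullness is fed back to the rate control. The class below is a hedged sketch with assumed names and units.

    class TransmissionBuffer:
        """Leaky-bucket model of the encoder output buffer."""

        def __init__(self, size_bits, channel_bps, frame_rate):
            self.size = float(size_bits)
            self.level = 0.0
            self.drain_per_frame = channel_bps / frame_rate

        def update(self, coded_bits):
            """Add one coded frame, drain one frame interval of channel
            capacity, and return the fullness as a fraction of the buffer size."""
            self.level = max(0.0, self.level + coded_bits - self.drain_per_frame)
            self.level = min(self.level, self.size)     # clip at overflow
            return self.level / self.size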

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Processing of video elementary streams

H04N 21/234

Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion (use of rate-distortion criteria H04N 19/147)
Definition statement

This place covers:

The control of video coding by means of the quality after decoding, as measured e.g. by a distortion measurement, or as estimated by means of subjective tests.

This subgroup should be assigned when the quality is not particularly linked to the output bit-rate.

References
Limiting references

This place does not cover:

Use of rate-distortion criteria

H04N 19/147

Informative references

Attention is drawn to the following places, which may be of interest for search:

Data rate or code amount at the encoder output, e.g. where the quality measure is directly linked to output bit-rate

H04N 19/146

Availability of hardware or computational resources, e.g. encoding based on power-saving criteria
Definition statement

This place covers:

The control of video coding depending on the availability of hardware or computational resources, e.g. encoding based on power-saving criteria or time-constrained encoding.

media22.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Prioritisation of hardware or computational resources, e.g. adaptively controlling the assignment of coding resources

H04N 19/127

Implementation details or hardware specially adapted for video compression or decompression

H04N 19/42

Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
Definition statement

This place covers:

  • The control of video coding as a function of the coding mode assigned to the unit to be coded, i.e. the coding mode of the unit to be coded is predefined or preselected.
  • The subgroup H04N 19/159 covers the case where the coding mode is the prediction type used for the unit to be coded, e.g. intra, inter or bidirectional, as in the figure directly below.

media23.png

The subgroup H04N 19/16 covers the case where the assigned coding mode is for a given display mode, e.g. for interlaced or progressive display mode, as in the figure directly below.

media24.png

User input
Definition statement

This place covers:

The control of the video encoding by means of the input from a user, e.g. from a user interface.

Feedback from the receiver or from the transmission channel
Definition statement

This place covers:

  • The control of encoding the elementary video stream as a function of the feedback from the client/receiver or from the transmission channel, e.g. as in the figure below.
  • The subgroup H04N 19/166 covers in particular the case where the feedback indicates a certain amount of transmission errors, e.g. by means of bit-error-rate or packet-error-rate detection.

media25.png

Relationships with other classification places

The control of encoding as a function of the feedback from the receiver or from the transmission channel in a general telecommunication context is covered in H04L and H04W.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Embedding additional information in the video signal during the compression process

H04N 19/46

Control signalling related to video distribution between receiver, transmitter, and network components

H04N 21/63

Transmission of management data between client and server

H04N 21/65

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

BER

Bit Error Rate

PER

Packet Error Rate

Position within a video image, e.g. region of interest [ROI]
Definition statement

This place covers:

  • The control of the video encoding as a function of a coding unit's position within a video image, e.g. the adoption of coding parameters adapted to a region of interest, different coding of foreground and of background, different coding at the image centre and at the image borders.
  • Adaptive video coding generally depends only indirectly on the position within an image, e.g. coding parameters may be varied across coding units, e.g. blocks.
  • The present subgroup covers the case where the spatial position within the image is explicitly and directly defined as a criterion.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Image region as coding unit

H04N 19/17

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

ROI

Region Of Interest

characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
Definition statement

This place covers:

Definition of the video coding units that are controlled by, or that control, the adaptive coding. The subgroups of H04N 19/169 define explicitly which coding units are meant.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

with respect to H04N 19/179, referring to scene or shot as coding unit:

Methods involving scene cut or scene change detection in combination with video compression

H04N 19/87

with respect to H04N 19/187, referring to scalable layer as coding unit:

Hierarchical and scalability techniques

H04N 19/30

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

(Video) Object

MPEG-4 object, i.e. a region of the image with arbitrary shape

Slice

A set of blocks within an image, e.g. a line of blocks.

Block

A rectangular matrix of pixels.

Macroblock

MPEG coding unit formed by four blocks arranged as a 2 x 2 matrix.

Group of pictures

MPEG coding unit formed by a set of consecutive pictures.

Scalable video layer

Coding unit of a scalable encoded video elementary stream

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

GOB

Group of Blocks

GOP

Group of Pictures

GOF

Group of Frames

FMO

H.264 Flexible Macroblock Ordering

In patent documents, the following words/expressions are often used as synonyms:

  • "slice" and "GOB"; "block" and "tile"
the unit being an image region, e.g. an object
Definition statement

This place covers:

Adaptive coding applied to regions of interest [ROI].

the region being a slice, e.g. a line of blocks or a group of blocks
Definition statement

This place covers:

Adaptive coding applied to any group of blocks, as long as the blocks are linked to each other in a well-defined manner, such as slices in AVC and tiles in HEVC.

characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding
Definition statement

This place covers:

Special mathematical or algorithmic formulations for the methods or tools used for video coding adaptation.

Special rules of classification

This group is residual with respect to its subgroups.

using optimisation based on Lagrange multipliers
Definition statement

This place covers:

The formulation in terms of optimisation based on Lagrange multiplier techniques, as e.g. in the cost function defined as C = R + LD, where R is the output rate, L is the Lagrange multiplier, and D is the distortion after decoding.
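
Written out in the notation used above, the Lagrangian decision amounts to evaluating the cost for every candidate coding option m and keeping the minimiser; how the multiplier is chosen (often as a function of the quantisation parameter) is implementation-specific and not prescribed here.

    \[
      C(m) \;=\; R(m) + L\,D(m), \qquad
      m^{*} \;=\; \arg\min_{m \in \mathcal{M}} C(m)
    \]

where \(\mathcal{M}\) is the set of candidate options (modes, partitions, motion vectors), \(R(m)\) the resulting rate, \(D(m)\) the distortion after decoding and \(L\) the Lagrange multiplier.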

the adaptation method, adaptation tool or adaptation type being iterative or recursive
Definition statement

This place covers:

  • Iterative and recursive algorithms and techniques applied to the adaptation of video coding.
  • The special case of two-pass or two-step algorithms is covered by H04N 19/194.
being specially adapted for the computation of encoding parameters, e.g. by averaging previously computed encoding parameters (processing of motion vectors H04N 19/513)
Definition statement

This place covers:

Details of the mathematical laws or algorithms used for the computation of encoding parameters (e.g. quantisation step, coding mode), e.g. estimating a current encoding parameter by averaging previously computed encoding parameters, or deriving the coding mode for the current coding unit from the coding modes of the neighbouring coding units. Neighbouring coding units may relate to views, layers, or spatial or temporal neighbours.
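
For illustration, deriving an encoding parameter from already-coded neighbours can be sketched as a simple averaging predictor over the left, above and temporally co-located units; the neighbour set, the fallback value and the function name are assumptions.

    def predict_parameter(left=None, above=None, colocated=None, default=26):
        """Predict an encoding parameter (e.g. a quantisation parameter) for
        the current coding unit by averaging the previously computed values of
        its available neighbours, falling back to an assumed default."""
        known = [v for v in (left, above, colocated) if v is not None]
        return round(sum(known) / len(known)) if known else default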

References
Limiting references

This place does not cover:

Formulations for processing of calculated motion vectors

H04N 19/513

Informative references

Attention is drawn to the following places, which may be of interest for search:

Formulations for initializing motion vector search

H04N 19/56

using video object coding
Definition statement

This place covers:

Details of object-based video coding, e.g. according to the standard MPEG-4.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Hierarchical and scalability techniques (cf. H04N 19/29)

H04N 19/30

Processing of video elementary streams in the server, e.g. for generating or manipulating the scene composition of objects

H04N 21/234

Processing of video elementary streams in the server involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements, e.g. by decomposing video signals into objects

H04N 21/2343

Processing of video elementary streams in the client device, e.g. involving rendering scenes according to scene graphs

H04N 21/44

Contour coding

G06T 9/20

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

(Video) Object

MPEG-4 object, i.e. a region of the image with arbitrary shape

Alpha-plane

A discrete bitmap (generally binary) defining the part of a frame constituting a given object, e.g. in terms of the position of the pixels belonging to the object or in terms of the position of the blocks covering the object.

Sprite

A unified background image derived by compositing the backgrounds of the single frames of a video sequence, e.g. having a camera motion throughout a video segment (within e.g. a scene, a shot, a GOP, a sequence). It may be static or dynamic.

Scene description coding

The coded representation of the spatiotemporal positioning of audio-visual objects as well as their behaviour in response to interaction, as e.g. in the standard MPEG-4 Part 11.

Synthetic/natural hybrid coding

Part of the MPEG-4 standard relating to coding facial animation and mesh compression.

Synthetic picture component

A picture component that is coded by geometric modelling and synthesised at reconstruction (e.g. an avatar).

Natural picture component

A picture component that is coded "as it stands" without geometric modelling.

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

BIFS

BInary Format for Scenes

SNHC

Synthetic/Natural Hybrid Coding

VOL

Video Object Layer

VOP

Video Object Plane

In patent documents, the following words/expressions are often used as synonyms:

  • "object", "video object", and "video object plane (VOP)"
using hierarchical techniques, e.g. scalability (H04N 19/63 takes precedence)
Definition statement

This place covers:

  • Details of video coding, where the elementary video stream is coded so that it contains a hierarchy of different compressed representations of the same video sequence, wherein each representation may correspond e.g. to a different video resolution or video format. Layered coding is also covered here.
  • The hierarchy may be incremental, as e.g. in scalable video coding (like the extension of the standard H.264 called Scalable Video Coding [SVC]).
References
Limiting references

This place does not cover:

Transform coding using sub-band based transform, e.g. wavelets

H04N 19/63

Informative references

Attention is drawn to the following places, which may be of interest for search:

Processing of video elementary streams in the server involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements, e.g. by decomposing video signals into layers at the transmitter side

H04N 21/2343

Controlling the complexity of the video stream at the transmitter side, e.g. by scaling the resolution or bitrate of the video stream

H04N 21/2662

Processing of video elementary streams in the client device involving reformatting operations of video signals for household redistribution, storage of real-time display, e.g. by decomposing video signals into layers at the receiver side

H04N 21/4402

Content or additional data management, e.g. controlling the complexity of the video stream at the receiver side

H04N 21/462

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Temporal scalability

Scalability in terms of frame rate, meaning that a given bit stream includes different sub-streams each with a different frame rate or sub-streams that, when combined, increase the output frame rate.

Spatial scalability

Scalability in terms of spatial video sampling rate or resolution (e.g. quantisation step size, pixel bit depth), meaning that a given bit stream includes different sub-streams each with a different frame size or resolution or sub-streams that, when combined, increase the output frame size or resolution.

in the temporal domain
Definition statement

This place covers:

Performing hierarchical or layered coding by acting on temporal resolution, e.g. temporal scalability.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Predictive coding using temporal sub-sampling or interpolation

H04N 19/587

in the spatial domain
Definition statement

This place covers:

Performing hierarchical or layered coding by acting on spatial resolution, e.g. spatial scalability.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Predictive coding involving spatial sub-sampling or interpolation

H04N 19/59

with arrangements for assigning different transmission priorities to video input data or to video coded data
Definition statement

This place covers:

The preliminary organisation of the video elementary stream with assignment of different priorities or importance to data to be further transmitted, e.g. for transmission or dropping.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Error resilience techniques for digital video coding involving data partitioning

H04N 19/66

Control signalling in networks for selective content distribution, e.g. multimode transmission

H04N 21/63

Cryptographic protocols

H04L 9/00

Network security protocols

H04L 63/00

Protocols for real-time services in data packet switching networks

H04L 65/00

Network protocols for data switching network services

H04L 67/00

using video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream
Definition statement

This place covers:

Transcoding of the elementary video stream at the level of digital video coding, i.e. partial or full decoding of a coded input stream and re-encoding of the decoded output stream.
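
The chain described here can be pictured with the hedged sketch below; decode_stream and encode_stream stand in for real codec implementations and are purely hypothetical, as is the passing of reusable side information such as motion vectors.

    def transcode(coded_input, decode_stream, encode_stream, **target_params):
        """Full-decode transcoding: reconstruct the pictures of the coded input
        stream and re-encode them with new parameters (e.g. another bitrate,
        resolution or standard). A partial-decode transcoder would instead
        reuse syntax elements of the input stream, e.g. motion vectors."""
        pictures, side_info = decode_stream(coded_input)
        return encode_stream(pictures, reuse=side_info, **target_params)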

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Video standard conversion at the pixel level, e.g. for analog television

H04N 7/01

Video conference systems, e.g. reformatting video signals

H04N 7/15

Processing of video elementary streams at a server involving reformatting operations of video signals

H04N 21/2343

Processing of video elementary streams at a client device involving reformatting operations of video signals

H04N 21/4402

Information retrieval, e.g. distillation of HTML documents for optimising the visualization of content or computer file format conversion

G06F 16/00

Cryptographic protocols

H04L 9/00

Network security protocols

H04L 63/00

Protocols for real-time services in data packet switching networks

H04L 65/00

Network protocols for data switching network services

H04L 67/01

characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation (H04N 19/635 takes precedence)
Definition statement

This place covers:

Implementation details or hardware specific for elementary video compression or decompression, e.g. dedicated software implementation, memory arrangements, parallel processing or hardware for motion estimation or compensation.

References
Limiting references

This place does not cover:

Filter definition or implementation details for defining sub-band transforms

H04N 19/635

Informative references

Attention is drawn to the following places, which may be of interest for search:

Decoder specific implementations

H04N 19/44

Binary arithmetic

G06F 7/60

Execution of machine instructions

G06F 9/30

Pipelines

G06F 9/38

Resource allocation

G06F 9/50

Transfer of information, buses

G06F 13/00

Digital computing

G06F 17/00

Complex mathematical operations

G06F 17/10

Software or hardware implementations of Fourier, Walsh or analogous domain transformations

G06F 17/14

characterised by memory arrangements (H04N 19/433 takes precedence)
Definition statement

This place covers:

  • Details of memory arrangements or management specifically dedicated to video compression.
  • The subgroup H04N 19/426 covers details of memory downsizing techniques.
References
Limiting references

This place does not cover:

Techniques for memory access in motion estimation or compensation

H04N 19/433

Informative references

Attention is drawn to the following places, which may be of interest for search:

Accessing, addressing or allocating within memory systems or architectures in general

G06F 12/00

Memory management for general purpose image data processing

G06T 1/60

Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators, e.g. display memories

G09G 5/00

Static storage for general purpose data processing, e.g. memories, shift registers

G11C

Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
Definition statement

This place covers:

Video decoders not symmetric with the corresponding encoders, i.e. the decoding means or steps are not a mere reversal of the corresponding encoding means or steps, or specific hardware or software implementation details for the video decoder.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Implementation details or hardware specific for video encoding and decoding

H04N 19/42

Complex mathematical operations

G06F 17/10

Embedding additional information in the video signal during the compression process (H04N 19/517, H04N 19/68, H04N 19/70 take precedence)
Definition statement

This place covers:

  • Subject matter wherein additional information is provided and transmitted within the compressed video signal, e.g. flag information or ancillary encoding information without details of the syntax-related data structure, or watermarking.
  • Encoding parameters are generally included for transmission in the video elementary stream.
  • This group or its subgroups should be assigned if special details are provided about their insertion for transmission in the stream; e.g. compression of encoding parameters is covered by H04N 19/463.
References
Limiting references

This place does not cover:

Motion vector coding and transmission

H04N 19/517

Insertion of resynchronisation markers into the bitstream

H04N 19/68

Syntax aspects related to video coding

H04N 19/70

characterised by the embedded information being invisible, e.g. watermarking
Definition statement

This place covers:

Details of additional information that is embedded during the coding process into the image part or into the auxiliary information of the elementary video bit stream so as to be invisible, e.g. by watermarking.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuits or arrangements for control or supervision between transmitter and receiver, e.g. display, printing, storage or transmission of additional information in scanning, transmission or reproduction of documents or the like

H04N 1/32

Generation or processing of content or additional data for video distribution by content creator independently of the distribution process; Content for video distribution per se

H04N 21/80

Generation of protective data involving watermarking as additional data for video distribution

H04N 21/8358

using compressed domain processing techniques other than decoding, e.g. modification of transform coefficients, variable length coding [VLC] data or run-length data (motion estimation in a transform domain H04N 19/547; processing of decoded motion vectors H04N 19/513)
Definition statement

This place covers:

Details of compressed domain processing techniques other than decoding, e.g. modification of transform coefficients, of VLC data or of run-length data, filtering in the compressed domain.

media26.png

References
Limiting references

This place does not cover:

Processing of decoded motion vectors

H04N 19/513

Motion estimation in a transform domain

H04N 19/547

using predictive coding (H04N 19/61 takes precedence)
Definition statement

This place covers:

Predictive digital video coding techniques not otherwise provided for in other subgroups.

References
Limiting references

This place does not cover:

Transform coding used in combination with predictive coding, where the transform coding constitutes a significant, non-trivial detail

H04N 19/61

involving temporal prediction (adaptive coding with adaptive selection between spatial and temporal predictive coding H04N 19/107; adaptive coding with adaptive selection among a plurality of temporal predictive coding modes H04N 19/109)
Definition statement

This place covers:

  • Predictive digital video coding techniques involving temporal prediction not otherwise provided for in other subgroups.
  • Details of temporal prediction are classified here.
References
Limiting references

This place does not cover:

Adaptive coding with adaptive selection between spatial and temporal predictive coding

H04N 19/107

Adaptive coding with adaptive selection among a plurality of temporal predictive coding modes

H04N 19/109

using conditional replenishment
Definition statement

This place covers:

  • Temporal predictive coding using conditional replenishment, i.e. transmitting only a portion of a picture in which a change has been detected with respect to the corresponding co-located portion of the immediately preceding picture.
  • Conditional replenishment may also be seen as motion-compensated temporal predictive encoding that uses only skipping or transmission with a zero motion vector, as in the sketch below.
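
A minimal Python sketch of conditional replenishment follows: each block of the current picture is compared with the co-located block of the previous picture using the sum of absolute differences (SAD), and only blocks whose SAD exceeds a threshold are transmitted. The block size and threshold are illustrative values.

```python
import numpy as np


def conditional_replenishment(prev: np.ndarray, curr: np.ndarray,
                              block: int = 8, threshold: float = 64.0):
    """Return the block positions to retransmit and the reconstructed frame.

    A block of the current frame is transmitted only if its SAD against the
    co-located block of the previous frame exceeds the threshold; otherwise
    the previous (co-located) block is simply repeated.
    """
    h, w = curr.shape
    reconstructed = prev.copy()
    transmitted = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            sad = np.abs(curr[y:y + block, x:x + block].astype(np.int64)
                         - prev[y:y + block, x:x + block].astype(np.int64)).sum()
            if sad > threshold:
                reconstructed[y:y + block, x:x + block] = curr[y:y + block, x:x + block]
                transmitted.append((y, x))
    return transmitted, reconstructed


if __name__ == "__main__":
    prev = np.zeros((16, 16), dtype=np.uint8)
    curr = prev.copy()
    curr[0:8, 8:16] = 200          # only one block changes
    blocks, rec = conditional_replenishment(prev, curr)
    print(blocks)                  # [(0, 8)]
```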
Motion estimation or motion compensation
Definition statement

This place covers:

  • Details of disparity estimation and compensation in stereoscopic or multi-view video coding are also covered in this subgroup and in its subgroups. For a synopsis of motion estimation techniques in video coding, see the figure below.

media27.png
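
For illustration of the block-based matching and search-window notions defined in the glossary below, the following Python sketch performs full-search block matching with the sum of absolute differences (SAD) as matching criterion. The block size and search range are arbitrary illustrative values.

```python
import numpy as np


def full_search_motion_vector(ref: np.ndarray, cur: np.ndarray,
                              y: int, x: int, block: int = 8,
                              search_range: int = 4):
    """Full-search block matching: find the motion vector (dy, dx) minimising
    the SAD between the current block and a candidate block inside the
    search window of the reference frame."""
    h, w = ref.shape
    cur_block = cur[y:y + block, x:x + block].astype(np.int64)
    best_mv, best_sad = (0, 0), None
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            ry, rx = y + dy, x + dx
            if ry < 0 or rx < 0 or ry + block > h or rx + block > w:
                continue  # candidate block falls outside the reference frame
            sad = np.abs(cur_block - ref[ry:ry + block, rx:rx + block].astype(np.int64)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv, best_sad


if __name__ == "__main__":
    ref = np.zeros((32, 32), dtype=np.uint8)
    ref[10:18, 12:20] = 255              # a bright patch in the reference frame
    cur = np.zeros((32, 32), dtype=np.uint8)
    cur[12:20, 14:22] = 255              # the same patch shifted by (+2, +2)
    mv, sad = full_search_motion_vector(ref, cur, y=12, x=14)
    print(mv, sad)                       # (-2, -2) 0
```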

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Picture signal circuitry for video frequency region, e.g. for movement detection in television systems not related to digital video coding

H04N 5/14

Conversion of standards for analogue television systems, e.g. at pixel level involving interpolation processes involving the use of motion vectors

H04N 7/01

Analysis of motion by image analysis in general

G06T 7/20

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Motion vector

A two-dimensional vector used for inter prediction that provides an offset from the coordinates in the decoded picture to the coordinates in a reference picture.

Global motion estimation

Process to estimate the part of motion in a video sequence caused by camera motion, e.g. background motion by panning or zooming.

Multiresolution motion estimation

Motion estimation performed on the same picture of a video sequence at different spatial sampling resolutions (coarse-to-fine: starting from the lowest resolution; fine-to-coarse: starting from the highest resolution).

Block-based matching motion estimation

Classic motion estimation based on the search of a best matching block in a reference frame.

Occlusion

A part of background or of a foreground object that is hidden in one frame and then uncovered in a following frame.

(Motion) Search window

A region in a reference frame, where the search for the block or feature best matching the current block or feature is performed.

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

MV

Motion Vector

GMV

Global Motion Vector

MAE

Mean Absolute Error

MAD

Mean Absolute Difference

SAD

Sum of Absolute Differences

MSE

Mean Squared Error

CCF

Cross-Correlation Function

PDC

Pixel Difference Classification

DFD

Displaced Frame Difference

In patent documents, the following words/expressions are often used as synonyms:

  • "reference frame" and "anchor frame"
Processing of motion vectors
Definition statement

This place covers:

  • Subject matter wherein the determined or existing motion vectors are subjected to further processing or modification, e.g. scaling of motion vectors for scalability or transcoding purposes, encoding of motion vectors, reducing or dropping of motion vectors.
  • Motion vector coding and predictive coding are covered in the subgroups.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Processing of encoding parameters different from motion vectors

H04N 19/46

using feature points or meshes
Definition statement

This place covers:

Motion estimation wherein motion vectors are attached to specific feature points or points of a mesh, e.g. affine motion models.

Motion estimation based on rate distortion criteria
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Rate distortion as a criterion for adaptive coding

H04N 19/147

Motion compensation with multiple frame prediction using two or more reference frames in a given prediction direction
Definition statement

This place covers:

Uni-directional or bi-directional motion compensation with more than one reference frame per direction

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Bi-directional motion frame interpolation

Temporal interpolation where a frame is predicted as a function both of a preceding anchor frame and of a succeeding anchor frame, e.g. by averaging.

Motion compensation with bidirectional frame interpolation, i.e. using B-pictures
Definition statement

This place covers:

Bi-directional motion compensation with one or more than one reference frame per direction

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Bi-directional motion frame interpolation

Temporal interpolation where a frame is predicted as a function both of a preceding anchor frame and of a succeeding anchor frame, e.g. by averaging.
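
A minimal sketch of the bi-directional interpolation described above: a B-picture block is predicted as a (weighted) average of a motion-compensated block from the preceding anchor frame and one from the succeeding anchor frame. The equal weights are an illustrative default.

```python
import numpy as np


def bidirectional_prediction(prev_block: np.ndarray, next_block: np.ndarray,
                             w_prev: float = 0.5, w_next: float = 0.5) -> np.ndarray:
    """Predict a B-picture block by weighted averaging of a block from the
    preceding anchor frame and a block from the succeeding anchor frame."""
    pred = w_prev * prev_block.astype(np.float64) + w_next * next_block.astype(np.float64)
    return np.clip(np.rint(pred), 0, 255).astype(np.uint8)


if __name__ == "__main__":
    prev_block = np.full((4, 4), 100, dtype=np.uint8)
    next_block = np.full((4, 4), 140, dtype=np.uint8)
    print(bidirectional_prediction(prev_block, next_block)[0, 0])  # 120
```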

Motion compensation with long-term prediction, i.e. the reference frame for a current frame not being the temporally closest one (H04N 19/23 takes precedence)
Definition statement

This place covers:

Prediction of a frame (Ppred) from an anchor frame (Panc) that is not the closest anchor frame preceding or succeeding the frame to be predicted, cf. figure.

media28.png

References
Limiting references

This place does not cover:

Video object coding with coding of regions that are present throughout a whole video segment, e.g. sprites, background or mosaic

H04N 19/23

involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
Definition statement

This place covers:

Sub-sampling or interpolation in the temporal domain during digital video compression or decompression.
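
For illustration, the following Python sketch performs temporal decimation (keeping every second frame) and rebuilds the dropped frames by averaging the two neighbouring transmitted frames; this simple motion-unaware interpolation is an illustrative assumption, real systems typically use motion-compensated interpolation.

```python
import numpy as np


def decimate(frames, factor: int = 2):
    """Temporal sub-sampling: keep every `factor`-th frame."""
    return frames[::factor]


def interpolate_midpoints(frames):
    """Temporal interpolation: rebuild intermediate frames by averaging the
    two neighbouring transmitted frames (no motion compensation)."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        out.append(((a.astype(np.float64) + b.astype(np.float64)) / 2).astype(a.dtype))
    out.append(frames[-1])
    return out


if __name__ == "__main__":
    frames = [np.full((2, 2), v, dtype=np.uint8) for v in (0, 10, 20, 30, 40)]
    kept = decimate(frames)                   # frames with values 0, 20, 40
    rebuilt = interpolate_midpoints(kept)     # approximately 0, 10, 20, 30, 40
    print([f[0, 0] for f in rebuilt])
```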

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Conversion of standards for analogue television systems, at pixel level involving interpolation processes

H04N 7/01

Adaptive sampling for adaptive digital video coding

H04N 19/132

Video compression using hierarchical techniques in the temporal domain

H04N 19/31

involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
Definition statement

This place covers:

  • Sub-sampling or interpolation in the spatial domain during digital video compression or decompression.
  • Details of sub-sampling or interpolation operations during motion estimation and compensation with sub-pixel accuracy are also covered here; a sketch of sub-pixel interpolation follows below.
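
The following minimal Python sketch interpolates a sample at a half-pel position by bilinear averaging of the surrounding integer-pel samples, of the kind used for motion compensation with sub-pixel accuracy. The two-tap bilinear filter is an illustrative simplification; standardised codecs use longer interpolation filters.

```python
import numpy as np


def half_pel_sample(frame: np.ndarray, y2: int, x2: int) -> float:
    """Return the sample at half-pel position (y2/2, x2/2) by bilinear
    interpolation of the surrounding integer-pel samples."""
    y0, x0 = y2 // 2, x2 // 2
    y1 = min(y0 + (y2 % 2), frame.shape[0] - 1)
    x1 = min(x0 + (x2 % 2), frame.shape[1] - 1)
    return (float(frame[y0, x0]) + float(frame[y0, x1]) +
            float(frame[y1, x0]) + float(frame[y1, x1])) / 4.0


if __name__ == "__main__":
    frame = np.array([[10, 30], [50, 70]], dtype=np.uint8)
    print(half_pel_sample(frame, 0, 1))   # 20.0 (horizontal half-pel)
    print(half_pel_sample(frame, 1, 1))   # 40.0 (diagonal half-pel)
```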
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Conversion of standards for analogue television systems, at pixel level involving interpolation processes

H04N 7/01

Adaptive sampling for adaptive digital video coding

H04N 19/132

Video compression using hierarchical techniques in the spatial domain

H04N 19/33

Motion estimation or motion compensation with sub-pixel accuracy

H04N 19/523

Scaling the whole image or part thereof, e.g. by interpolation based image scaling

G06T 3/40

involving spatial prediction techniques
Definition statement

This place covers:

Digital video compression involving spatial prediction techniques, e.g. details of intra prediction.
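
As an illustrative aid, the sketch below implements two elementary intra prediction modes (DC and vertical) that predict a block from the reconstructed samples above and to the left of it, broadly in the spirit of H.264/HEVC intra prediction; the mode set and interface are simplified assumptions.

```python
import numpy as np


def intra_predict(top: np.ndarray, left: np.ndarray, size: int, mode: str) -> np.ndarray:
    """Predict a size x size block from its reconstructed neighbours.

    'dc'       : every sample is the mean of the top and left neighbours.
    'vertical' : each column repeats the sample directly above the block.
    """
    if mode == "dc":
        dc = np.concatenate([top, left]).mean()
        return np.full((size, size), dc)
    if mode == "vertical":
        return np.tile(top, (size, 1)).astype(np.float64)
    raise ValueError("unknown mode")


if __name__ == "__main__":
    top = np.array([100, 102, 104, 106], dtype=np.float64)   # row above the block
    left = np.array([98, 99, 100, 101], dtype=np.float64)    # column left of the block
    print(intra_predict(top, left, 4, "dc")[0, 0])            # 101.25
    print(intra_predict(top, left, 4, "vertical")[0])         # [100. 102. 104. 106.]
```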

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Adaptive coding with adaptive selection between spatial and temporal predictive coding

H04N 19/107

Adaptive coding with adaptive selection among a plurality of spatial predictive coding modes

H04N 19/11

specially adapted for multi-view video sequence encoding
Definition statement

This place covers:

Details of stereoscopic or multi-view digital video coding including processing (e.g. compression) of depth maps.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Motion estimation or compensation, e.g. details of vector-based inter-view estimation and compensation

H04N 19/51

using transform coding
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Fourier, Walsh or analogous domain transformations in general, e.g. implementation details of DCT or wavelet transforms

G06F 17/14

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

DCT

Discrete Cosine Transform

KLT

Karhunen-Loève Transform

DST

Discrete Sine Transform

FFT

Fast Fourier Transform

WLT

Wavelet Transform

MCTF

Motion Compensated Temporal Filtering

EZW

Embedded Zerotrees of Wavelets

In patent documents, the following words/expressions are often used as synonyms:

  • "discrete cosine transform" and "cosine transform"
{the transform being operated outside the prediction loop}
Definition statement

This place covers:

Transform based predictive video coders of the type displayed in the figure below, i.e. where the transform is operated before or after the prediction loop.

media29.png

characterised by filter definition or implementation details
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Implementation details or hardware specially adapted for video compression or decompression

H04N 19/42

using error resilience
Definition statement

This place covers:

Techniques applied at the level of encoding the elementary video stream for the purpose of increasing the error resilience thereof.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems for detection or correction of transmission errors in the transmission of television signals using pulse code modulation

H04N 19/89

Selective content distribution, e.g. error resilience techniques for storage at video servers or for channel coding adapted to video distribution

H04N 21/00

Channel coding of digital bit-stream for video distribution

H04N 21/2383

Coding, decoding or code conversion, e.g. for error correction in general

H03M 13/00

Arrangements for detecting or preventing errors in the information received, e.g. preventing errors by adapting the channel coding

H04L 1/00

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Resynchronisation marker

A special Variable Length Coding binary word inserted into the bit stream; it forces re-initialisation of VLC decoding.

Reversible Variable Length Coding

VLC allowing backward decoding of the stream, i.e. decoding of a VLC coded binary string starting from the end to the beginning.

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

Resync marker

Resynchronisation marker

RVLC

Reversible Variable Length Coding

UEP

Unequal Error Protection

characterised by syntax aspects related to video coding, e.g. related to compression standards
Definition statement

This place covers:

Subject matter wherein details about standards related coding syntax or about using the syntax in the coding process are provided, e.g. H.264 supplemental enhancement information [SEI], headers definitions, details of elementary stream parsing.

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Syntax

The definition of the binary codes and values that make up a conforming elementary video bit stream.

Semantics

The definition of the meaning of the syntax and of the process flow for decoding the syntax elements to produce the digital video output.

Profile/Level

Operational level of a standard compliant decoder, which uses a predefined subset of the features defining the complete decoder according to the standard. The definition of the predefined subset also falls within the prescriptions of the standard.

Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation (H04N 19/635, H04N 19/86 take precedence)
Definition statement

This place covers:

Subject matter wherein a filtering operation specifically adapted to video compression is included in the video compression or decompression process (but is not necessarily adaptive), with details of the filtering operation provided.

References
Limiting references

This place does not cover:

Filter definition or implementation for sub-band based transform

H04N 19/635

Filtering for removal of coding artifacts

H04N 19/86

Informative references

Attention is drawn to the following places, which may be of interest for search:

Adaptive filtering operation

H04N 19/117

Pre-processing or post-processing specially adapted for video compression

H04N 19/85

Image filtering for image enhancement or restoration using local operators

G06T 5/20

Impedance networks, e.g. resonant circuits, filters in general

H03H

involving filtering within a prediction loop
Definition statement

This place covers:

  • The insertion of the filtering within a prediction loop and details of such a filter.
  • This subgroup is relevant only if the subject matter contributes non-trivial details of the filtering operation as in-loop filtering, regardless of whether the filtering is adaptive in the sense of H04N 19/117 or not.

media30.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Adaptive filtering operation

H04N 19/117

Filter definition or implementation for sub-band based transform

H04N 19/635

using pre-processing or post-processing specially adapted for video compression
Definition statement

This place covers:

  • Subject matter wherein the pre- or post-processing operation is present as a functional block but not necessarily adaptive in the video coding process, i.e. the pre- or post-processing is performed respectively prior to the input of, or after the output of, the video coding process.
  • This subgroup is relevant only if the subject matter to be classified contributes non-trivial details of pre- or post-processing, regardless of whether the filtering is adaptive in the sense of H04N 19/117 or not.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

In-loop filtering

H04N 19/82

involving reduction of coding artifacts, e.g. of blockiness
Definition statement

This place covers:

Processing techniques (e.g. filtering or interpolation in the spatial or in the temporal domain) adapted to reduce artefacts caused by digital video compression, e.g. blockiness from block-based transform compression, frame freeze or jerkiness from dropping frames at compression or transmission, false contours from limited bit depth resolution.
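
An extremely simplified Python sketch of deblocking in the spirit of the description above: samples on either side of a vertical block boundary are smoothed only when the discontinuity across the boundary is small enough to be a coding artefact rather than a genuine edge. The block size, threshold and one-tap correction are illustrative assumptions only.

```python
import numpy as np


def deblock_vertical_edges(frame: np.ndarray, block: int = 8, threshold: int = 20) -> np.ndarray:
    """Smooth small discontinuities across vertical block boundaries.

    For each boundary sample pair (p, q), if |p - q| is below the threshold the
    step is treated as a blocking artefact and both samples are pulled towards
    their average; larger steps are kept as genuine image edges.
    """
    out = frame.astype(np.float64).copy()
    for x in range(block, frame.shape[1], block):
        p, q = out[:, x - 1], out[:, x]          # columns on either side of the boundary
        diff = q - p
        mask = np.abs(diff) < threshold
        p[mask] += diff[mask] / 4.0
        q[mask] -= diff[mask] / 4.0
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)


if __name__ == "__main__":
    frame = np.zeros((8, 16), dtype=np.uint8)
    frame[:, 8:] = 12                            # small step at the block boundary
    print(deblock_vertical_edges(frame)[0, 7:9]) # boundary samples moved closer together
```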

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for suppressing or minimising disturbance (e.g. moiré, halo) in television systems

H04N 5/21

In-loop filtering

H04N 19/82

Filtering or interpolation as an error concealment technique

H04N 19/895

involving scene cut or scene change detection in combination with video compression
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Picture signal circuitry for video frequency region, e.g. circuitry for scene change detection in television systems.

H04N 5/14

Scene cut detection in adaptive video coding

H04N 19/142

involving rearrangement of data among different coding units, e.g. shuffling, interleaving, scrambling or permutation of pixel data or permutation of transform coefficient data among different blocks
Definition statement

This place covers:

Techniques for the rearrangement of data among different coding units at the level of a single elementary video stream within the operation of the video coder, e.g. shuffling, interleaving, scrambling, permutation of pixel data or permutation of transform coefficient data among different blocks.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Analogue secrecy systems in television systems

H04N 7/16

Adaptive scanning of coding units

H04N 19/129

Processing of video elementary streams for video distribution involving video stream encryption at the transmitter side

H04N 21/2347

Processing of video elementary streams involving video stream decryption

H04N 21/4405

Processing of video elementary streams involving video stream encryption at the receiver side

H04N 21/4408

involving methods or arrangements for detection of transmission errors at the decoder
Definition statement

This place covers:

  • Techniques for detecting transmission errors at the digital video decoder and at the level of the elementary video stream.
  • The subgroup H04N 19/895 covers details of detection in combination with error concealment.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Decoders specifically adapted for coding, decoding, compressing or decompressing digital video signals

H04N 19/44

Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using error resilience

H04N 19/65

Interfacing the downstream path of the transmission network originating from a server, e.g. channel decoding in selective content distribution

H04N 21/438

Monitoring of processes or resources, e.g. of downstream path of the transmission network at the receiver side

H04N 21/442

Monitoring of client processing errors or hardware failure in selective video distribution

H04N 21/4425

Control signalling between network components and server or clients, e.g. monitoring network process errors by the network

H04N 21/647

Coding, decoding or code conversion for error detection or error correction in general

H03M 13/00

using coding techniques not provided for in groups H04N 19/10-H04N 19/85, e.g. fractals
References
Limiting references

This place does not cover:

Methods or arrangements for coding, decoding, compressing or decompressing digital video signals

H04N 19/10

using adaptive coding

H04N 19/10

using video object coding

H04N 19/20

using hierarchical techniques, e.g. scalability

H04N 19/30

using video transcoding

H04N 19/40

Implementation details or hardware specially adapted for video compression or decompression

H04N 19/42

Decoders specifically adapted for coding, decoding, compressing or decompressing digital video signals

H04N 19/44

Embedding additional information in the video signal during the compression process

H04N 19/46

using compressed domain processing techniques other than decoding

H04N 19/48

using predictive coding

H04N 19/50

using transform coding

H04N 19/60

using error resilience

H04N 19/65

characterised by syntax aspects related to video coding

H04N 19/70

Details of filtering operations specially adapted for video compression

H04N 19/80

Pre-processing or post-processing specially adapted for video compression

H04N 19/85

Entropy coding, e.g. variable length coding [VLC] or arithmetic coding
Definition statement

This place covers:

Subject matter wherein the entropy coding is especially adapted to video compression, e.g. specifics of table entries for fixed and variable length coding, details of MPEG Huffman coding, details of H.264 arithmetic coding.
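
As a generic illustration of a variable length code table of the kind used by video entropy coders, the sketch below builds a Huffman code from observed symbol frequencies; it is plain Huffman construction, not the actual MPEG Huffman or H.264 arithmetic coding tables.

```python
import heapq
from collections import Counter


def build_vlc_table(symbols):
    """Build a Huffman variable-length-code table {symbol: bitstring}
    from the observed symbol frequencies."""
    counts = Counter(symbols)
    if len(counts) == 1:                       # degenerate case: a single symbol
        return {next(iter(counts)): "0"}
    heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(counts.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        f0, _, codes0 = heapq.heappop(heap)    # two least frequent subtrees
        f1, _, codes1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes0.items()}
        merged.update({s: "1" + c for s, c in codes1.items()})
        heapq.heappush(heap, [f0 + f1, next_id, merged])
        next_id += 1
    return heap[0][2]


if __name__ == "__main__":
    # Quantised coefficient levels of a toy block: 0 is by far the most frequent.
    levels = [0] * 20 + [1] * 6 + [-1] * 5 + [2] * 2
    table = build_vlc_table(levels)
    print(table)              # 0 gets the shortest codeword
    bits = "".join(table[v] for v in levels)
    print(len(bits), "bits")  # fewer bits than with fixed-length coding
```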

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]

H04N 19/13

Run-length coding for video compression

H04N 19/93

Conversion to or from variable length codes in general

H03M 7/40

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

VLC

Variable Length Coding

Run-length coding
Definition statement

This place covers:

  • Subject matter wherein the run-length coding is especially adapted to video compression.
  • In run-length coding a run, i.e. a sequence of identical data values, is coded by a representation of the data value together with the length of the sequence, as in the sketch below.
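
A minimal Python encode/decode pair implementing exactly the rule described above: each run of identical values is coded as a (value, run length) pair. The example input imitates zig-zag scanned transform coefficients, where long runs of zeros are typical.

```python
def rle_encode(values):
    """Code each run of identical values as a (value, run_length) pair."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [tuple(r) for r in runs]


def rle_decode(runs):
    """Expand (value, run_length) pairs back into the original sequence."""
    out = []
    for value, length in runs:
        out.extend([value] * length)
    return out


if __name__ == "__main__":
    coeffs = [17, 0, 0, 0, 3, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0]
    runs = rle_encode(coeffs)
    print(runs)                      # [(17, 1), (0, 3), (3, 1), (0, 5), (1, 1), (0, 5)]
    assert rle_decode(runs) == coeffs
```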
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Variable length coding in an adaptive video coding process

H04N 19/13

Conversion to or from run-length codes in general

H03M 7/46

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

RLE

Run-Length Encoding

Vector quantisation
Definition statement

This place covers:

Video compression using vector quantisation, i.e. by dividing a large set of points into groups (vectors) having approximately the same number of points closest to them and by representing each group by a single code, which is associated with its centroid point.
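
The following minimal sketch illustrates the encoding and decoding steps of vector quantisation: each input vector is mapped to the index of the nearest codebook centroid and reconstructed as that centroid. The hand-written codebook is an illustrative assumption; in practice codebooks are trained, e.g. with the LBG algorithm.

```python
import numpy as np


def vq_encode(vectors: np.ndarray, codebook: np.ndarray) -> np.ndarray:
    """Map each input vector to the index of its nearest codebook centroid."""
    # distances[i, j] = squared Euclidean distance between vector i and centroid j
    distances = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return distances.argmin(axis=1)


def vq_decode(indices: np.ndarray, codebook: np.ndarray) -> np.ndarray:
    """Replace each index by its centroid (the lossy reconstruction)."""
    return codebook[indices]


if __name__ == "__main__":
    # 2x2 image blocks flattened to 4-dimensional vectors.
    codebook = np.array([[0, 0, 0, 0],
                         [128, 128, 128, 128],
                         [255, 255, 255, 255]], dtype=np.float64)
    blocks = np.array([[10, 5, 0, 3],
                       [120, 130, 125, 140],
                       [250, 251, 255, 249]], dtype=np.float64)
    idx = vq_encode(blocks, codebook)
    print(idx)                        # [0 1 2]
    print(vq_decode(idx, codebook))
```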

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Compression in general, e.g. vector coding

H03M 7/30

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

VQ

Vector Quantisation

Tree coding, e.g. quad-tree coding
Definition statement

This place covers:

  • Video compression using tree coding.
  • Two-dimensional tree coding is called quad-tree coding and is performed by partitioning an image or a video frame by recursively subdividing it into four quadrants or regions, until each region may be represented by a single colour or code word, and coding the resulting tree data structure in which each internal node has exactly four children and each termination node (leaf) corresponds to a resulting region with the colour or code word associated to it, cf. R. Finkel and J.L. Bentley (1974). "Quad Trees: A Data Structure for Retrieval on Composite Keys". Acta Informatica 4 (1): 1–9.
  • Tree coding in higher dimension is defined correspondingly (e.g. octree, performed in three-dimensions by subdivision into eight volumetric regions).

media31.png
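
For illustration of the partitioning described above, the sketch below recursively splits a square picture (side length a power of two, an assumption for simplicity) into four quadrants until each region is uniform; the resulting tree of split flags and leaf values is the coded representation.

```python
import numpy as np


def quadtree(region: np.ndarray):
    """Recursively split a square region into four quadrants until uniform.

    Returns either ('leaf', value) or ('split', [NW, NE, SW, SE] subtrees).
    """
    if (region == region.flat[0]).all() or region.shape[0] == 1:
        return ("leaf", int(region.flat[0]))
    h = region.shape[0] // 2
    quadrants = [region[:h, :h], region[:h, h:], region[h:, :h], region[h:, h:]]
    return ("split", [quadtree(q) for q in quadrants])


if __name__ == "__main__":
    frame = np.zeros((4, 4), dtype=np.uint8)
    frame[2:, 2:] = 255                 # only the south-east quadrant differs
    print(quadtree(frame))
    # ('split', [('leaf', 0), ('leaf', 0), ('leaf', 0), ('leaf', 255)])
```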

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Image coding using tree coding, e.g. quadtree, octree

G06T 9/40

Matching pursuit coding
Definition statement

This place covers:

Video compression using matching pursuit coding, cf. S. G. Mallat and Z. Zhang, "Matching Pursuits with Time-Frequency Dictionaries", IEEE Transactions on Signal Processing, December 1993, pp. 3397–3415.
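
A minimal sketch of the matching pursuit decomposition in the sense of the cited paper: at each iteration the dictionary atom with the largest correlation to the residual is selected and its contribution subtracted. The identity dictionary used in the example is a trivial stand-in chosen only to show the mechanics.

```python
import numpy as np


def matching_pursuit(signal: np.ndarray, dictionary: np.ndarray, n_iter: int = 3):
    """Greedy matching pursuit over a dictionary of unit-norm atoms (rows).

    Returns a list of (atom_index, coefficient) pairs and the final residual.
    """
    residual = signal.astype(np.float64).copy()
    selections = []
    for _ in range(n_iter):
        correlations = dictionary @ residual          # inner product with each atom
        k = int(np.abs(correlations).argmax())        # best-matching atom
        coeff = float(correlations[k])
        residual -= coeff * dictionary[k]             # remove its contribution
        selections.append((k, coeff))
    return selections, residual


if __name__ == "__main__":
    dictionary = np.eye(4)                 # trivial orthonormal "dictionary"
    signal = np.array([0.0, 3.0, 0.5, -2.0])
    picks, res = matching_pursuit(signal, dictionary, n_iter=2)
    print(picks)          # [(1, 3.0), (3, -2.0)]
    print(res)            # [0.  0.  0.5 0. ]
```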

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

MP

Matching Pursuit

Adaptive-dynamic-range coding [ADRC]
Definition statement

This place covers:

  • Video compression using adaptive-dynamic-range coding, cf. Kondo et al., "Adaptive dynamic range coding scheme for future HDTV digital VTR", Proceedings of Signal Processing of HDTV, III. Fourth International Workshop on HDTV and Beyond, Turin, Italy, 4-6 Sept. 1991, p. 43-50.

The term "adaptive" in the "Adaptive-Dynamic-Range Coding" refers to the dynamic range being adaptive and not to the coding being adaptive, which is covered by H04N 19/10 and subgroups.

media32.png
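
A minimal sketch of block-wise adaptive dynamic range coding along the lines of the description above: each block is represented by its minimum value, its dynamic range, and per-sample codes that requantise the samples within that range. The block size and the 2-bit code length are illustrative assumptions.

```python
import numpy as np


def adrc_encode(block: np.ndarray, bits: int = 2):
    """Adaptive dynamic range coding of one block.

    The block is described by its minimum, its dynamic range (max - min) and,
    per sample, a `bits`-bit code requantising the sample within that range.
    """
    lo = int(block.min())
    dr = int(block.max()) - lo
    levels = (1 << bits) - 1
    if dr == 0:
        codes = np.zeros_like(block, dtype=np.int64)
    else:
        codes = np.rint((block.astype(np.float64) - lo) / dr * levels).astype(np.int64)
    return lo, dr, codes


def adrc_decode(lo: int, dr: int, codes: np.ndarray, bits: int = 2) -> np.ndarray:
    """Reconstruct sample values from the minimum, dynamic range and codes."""
    levels = (1 << bits) - 1
    return np.rint(lo + codes.astype(np.float64) / levels * dr).astype(np.int64)


if __name__ == "__main__":
    block = np.array([[100, 110], [130, 160]])
    lo, dr, codes = adrc_encode(block)
    print(lo, dr)                    # 100 60
    print(codes)                     # per-sample 2-bit codes
    print(adrc_decode(lo, dr, codes))
```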

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

ADRC

Adaptive-Dynamic-Range Coding

{involving fractal coding}
Definition statement

This place covers:

  • Lossy video compression using fractal algorithms, as described in Y. Fisher, D. N. Rogovin and T.-P. J. Shen, "Fractal (Self-VQ) Encoding of Video Sequences", Proc. of the Conference on Visual Communications and Image Processing '94, Chicago, IL, USA, 25-29 Sept. 1994, SPIE, vol. 2308, p.1359-1370 (1994).
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Methods for coding digital video signals using vector quantisation

H04N 19/94

Selective content distribution, e.g. interactive television or video on demand [VOD] (real-time bi-directional transmission of motion video data H04N 7/14 {; broadcast or conference over packet switching networks H04L 12/18})
Definition statement

This place covers:

  • Interactive video distribution processes, systems, or elements thereof, which are characterised by point-to-multipoint system configurations, and which are mainly used for unidirectional distribution or delivery of motion video data resulting from interactions between system operators, e.g. access or service providers, or users, e.g. subscribers, and system elements
  • Such systems include dedicated communication systems, such as television distribution systems, which primarily distribute or deliver motion video data in the manner indicated and which may, in addition, provide a framework for further, diverse data communications or services in either unidirectional or bi-directional form. However, video will occupy most of the downlink bandwidth in the distribution process.
  • Typically, system operators interface with transmitter-side elements or users interface with receiver-side elements in order to facilitate, through interaction with such elements, the dynamic control of data processing or data flow at various points in the system. This interaction is typically occasional or intermittent in nature.
  • Processes, systems or elements thereof specially adapted to the generation, distribution and processing of data which is either associated with video content, e.g. metadata, ratings, or related to the user or his environment and which has been actively or passively gathered. This data is either used to facilitate interaction or to alter or target the content.
Relationships with other classification places
  • H04N 21/00 is an application place for a large number of IT technologies, which are covered by the corresponding functional places
  • Video servers and clients use internally specific computing techniques. Corresponding techniques used in general computing are found in G06F. This concerns data storage, software architectures, error detection or correction, monitoring, video retrieval, browsing, Internet browsing, computer security, billing or advertising
  • Video servers and clients use specific telecommunication techniques for the video distribution process. Corresponding techniques used in generic telecommunication networks are found in subclasses H04B, H04H, H04L, H04W. This concerns monitoring or testing of transmitters/receivers, synchronisation in time-division multiplex, broadcast or multicast, maintenance, administration, testing, data processing in data switching networks, home networks, real-time data network services, data network security, applications for data network, wireless networks per se.
References
Limiting references

This place does not cover:

Real-time bi-directional transmission of motion video data

H04N 7/14

Informative references

Attention is drawn to the following places, which may be of interest for search:

Synchronising circuits with arrangements for extending range of synchronisation at the transmitter end

H04N 5/067

Television picture signal circuitry for scene change detection

H04N 5/147

Reproduction of recorded television signals

H04N 5/76

Interface circuits between an apparatus for recording television signals and a television receiver

H04N 5/775

Television signal recording using magnetic recording on tape for reproducing at a rate different from the recording rate

H04N 5/783

Conversion of standards in analog television systems

H04N 7/01

Adaptations for transmission by electric cable for domestic distribution in television systems

H04N 7/106

Signal processing in analog two-way television systems

H04N 7/173

Reproduction of recorded television signals

H04N 9/79

Diagnosis, testing or measuring for television receivers

H04N 17/04

Systems for the transmission of television signals using pulse code modulation using bandwidth reduction involving transcoding

H04N 19/40

Flight-deck installations for entertainment or communications

B64D 11/0015

Resetting in general

G06F 1/14

Constructional details of equipment or arrangements specially adapted for portable computer application

G06F 1/1626

Power management in computer systems

G06F 1/3203

Input arrangements for interaction with the human body based on nervous system activity detection

G06F 3/015

Interaction techniques for graphical user interfaces

G06F 3/048

Storage management

G06F 3/0604

RAID arrays per se

G06F 3/0689

Interfaces to printers

G06F 3/12

Digital output for controlling a plurality of local displays

G06F 3/1423

Software architectures; Program control

G06F 9/44, G06F 9/46

Error detection or correction; Monitoring

G06F 11/00

Addressing or allocating within memory systems or architectures

G06F 12/02

Prefetching while addressing of a memory level in which the access to the desired data or data block requires associative addressing means within memory systems or architectures

G06F 12/0862

Retrieval of video data

G06F 16/70

Retrieval from the web

G06F 16/95

Computer security

G06F 21/00

Printing data

G06K 15/02

Computer systems using learning methods

G06N 3/08

Billing; Advertising

G06Q 20/00, G06Q 30/00

Banking in general

G06Q 30/02

Image watermarking in general

G06T 1/0021

Image enhancement or restoration in general

G06T 5/00

Methods or arrangements for recognising scenes

G06V 20/00

Methods or arrangements for recognising human body or animal bodies or body parts

G06V 40/10

Methods or arrangements for acquiring or recognising human faces, facial parts, facial sketches, facial expressions

G06V 40/16

Methods or arrangements for recognising movements or behaviour

G06V 40/20

Adapting incoming signals to the display format of the display terminal

G09G 5/005

Details of formatting and decoding of an encoded audio signal representation into a data stream for transmission or storage purposes

G10L 19/167

Details of audio signal transcoding

G10L 19/173

Arrangements for data linking, networking or transporting, or for controlling an end to end session in a satellite broadcast system

H04B 7/18526

Arrangements for wireless networking or broadcasting of information in indoor or near-field type systems

H04B 10/114

Monitoring or testing of transmitters/receivers

H04B 17/00

Broadcast communication

H04H

Synchronisation in time-division multiplex

H04J 3/06

Allocation of channels according to the instantaneous demands of the users in time-division multiplex systems

H04J 3/1682

Arrangements for detecting or preventing errors in the information received by adapting the channel coding

H04L 1/0009

ARQ protocols

H04L 1/18

Arrangements for synchronising receiver with transmitter

H04L 7/00

Arrangements for synchronising receiver with transmitter by comparing receiver clock with transmitter clock

H04L 7/0012

Arrangements for synchronising receiver with transmitter wherein the receiver takes measures against momentary loss of synchronisation

H04L 7/0083

Key distribution for secret or secure communication

H04L 9/08

Key distribution for secret or secure communication, using a key distribution center, a trusted party or a key server

H04L 9/083

Arrangements for secret or secure communication including means for verifying the identity or authority of a user of the system

H04L 9/32

Charging arrangements in data networks

H04L 12/14

Broadcast or multicast in data switching networks

H04L 12/18

Data processing in data switching networks

H04L 12/56

Analog front ends or means for connecting modulators, demodulators or transceivers to a transmission line

H04L 27/0002

Maintenance or administration in data switching networks

H04L 41/00

Message switching systems

H04L 51/00

Data network security

H04L 63/00

Real-time data network services

H04L 65/00

Network arrangements or protocols in data packet switching networks for supporting network services or applications

H04L 67/00

Wireless networks

H04W

Special rules of classification

The classification scheme has a matrix structure, and symbols taken from its different cells allow classification of the relevant aspects of a document, as seen below.

Where a document does not clearly teach whether operations are performed on the server side or the client side, it should be placed by default in the server part of the scheme (H04N 21/20). Further, if both embodiments (server-side and client-side) are present, symbols from both H04N 21/20 and H04N 21/40 should be allocated.

H04N 21/20 Server

Architectures: H04N 21/214 (Specialised); H04N 21/218 (Source); H04N 21/222 (Secondary); H04N 21/226 (Hardware)

Elementary / Bitstream Operation

Management operations: H04N 21/251 (Learn); H04N 21/254 (Shop); H04N 21/258 (Client Data); H04N 21/262 (Scheduling); H04N 21/266 (Content Management)

End-User App: H04N 21/274 (Storing Data); H04N 21/278 (Directory Services)

H04N 21/40 Client

Architectures: H04N 21/4104 (Peripherals); H04N 21/414 (Specialised); H04N 21/418 (External); H04N 21/422 (Input); H04N 21/426 (Internal)

Elementary / Bitstream Operation: H04N 21/4305 (SYNC); H04N 21/431 (Visual Interface); H04N 21/432 (Retrieval); H04N 21/433 (Storage); H04N 21/434 (DeMultiplex); H04N 21/435 (Add Data); H04N 21/436 (Local Net); H04N 21/437 (Upstream); H04N 21/438 (Downstream); H04N 21/44 (Video); H04N 21/441 (User ID); H04N 21/442 (Monitoring); H04N 21/443 (OS)

Management operations: H04N 21/4508 (User Data); H04N 21/454 (Filtering); H04N 21/458 (Scheduling); H04N 21/462 (Data Management); H04N 21/466 (Learn)

End-user App: H04N 21/472 (Request); H04N 21/475 (Inputting); H04N 21/478 (Add Services); H04N 21/482 (Selection); H04N 21/485 (Configuration); H04N 21/488 (Data Services)

The columns of the matrix correspond to the following parts of the scheme:

H04N 21/20: for details of servers or processes related to the reception of the content from the content provider or related to the distribution of content to clients. Network interfaces are included but not the communication aspects with clients

H04N 21/40: for structural details of client devices or processes related among others to the processing, storing or displaying of the received content as well as user interfaces for accessing video services

H04N 21/60: for the nature of the downlink / uplink or the exchange of control signals or data between clients, servers, network

H04N 21/80: for specific multimedia content or processes taking place before distribution (usually by the content provider) and independently, according to their appropriate layer:

  • System architecture and topology
  • Functional and application aspects related to bit-stream processing or elementary operations
  • Functional and application aspects related to system management
  • Services and functionalities offered to the end-user

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Additional data

designates still pictures, textual, graphical or executable data such as software. It is used to convey supplemental information and can be generated prior to or during the distribution process itself, e.g. metadata, keys

Content

designates video or audio streams, which may be combined with additional data. Video data will always be present and occupy most of the downlink bandwidth in the distribution process

Server

designates an apparatus designed for adapting the content received from the content provider to the distribution network. It also manages the distribution to client devices or intermediate components over a network. Further servers may also be present for gathering or generating additional data, e.g. rights management server

Additional data server

designates a server whose sole purpose is the distribution or management of additional data. It is not in charge of the distribution of video or audio data

Client

designates an apparatus such as a TV receiver, a set-top-box, a PC-TV, a mobile appliance (e.g. mobile phone or receiver in a vehicle), for receiving video, audio and possibly additional data from one or several servers or intermediate components via a network for further processing, storing or displaying. It can also transmit this data on a home-based local network to further devices, e.g. a home server transmitting video to PCs and set-top-boxes within a home

Local network

pertains to a restricted area, e.g. a home or a vehicle, and designates the link between a client and its peripheral devices

Network

is to be distinguished from "local network": "network" designates the link between the server and the clients, or between the server and the intermediate components, or between the intermediate components and the clients, or between remotely located clients

Distribution

encompasses broadcasting, multicasting and unicasting techniques for transmitting content from one or more sources to one or more receiving stations. The distribution follows a request by a receiving station to the source, e.g. VOD, or a customization of the content by the source, e.g. targeting advertisements to a demographic group, in a unidirectional or bidirectional system. Additionally, distribution encompasses techniques where the client acts as a source and another client acts as a receiving station, e.g. a peer-to-peer system for sharing video among client devices

End-user

designates a physical person, e.g. a TV viewer, who consumes the content using the client device. He is the final recipient of the content distributed by the server

Interaction

covers actions occurring between or among two or more objects that have an effect upon one another, wherein objects comprise users, system operators, system elements, or content. The user may interact with content locally at the client device, e.g. for requesting additional data stored within the client device. The user may interact with content remotely through a server, e.g. for VOD playback control or for uploading video to a server. The client device may interact with the content, e.g. by selecting content based upon the user profile. The client device may interact with a server using a return channel, e.g. for authenticating the client or uploading client hardware capabilities. The server may interact with a client device, e.g. to force a client to tune to an advertisement channel

Upstream

designates the direction of data flow towards the source, e.g. a server receiving a request via a mobile phone network

Downstream

designates the direction of data flow towards a client, e.g. a client receiving data originating from a server

Elementary stream

An elementary stream (ES) as defined by the MPEG communication protocol designates the output of an audio or video encoder

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

VOD

Video On Demand

SI

Service Information

IP

Internet Protocol

OS

Operating System

PCR

Program Clock References

STB

Set-top-box

PC

Personal Computer

PVR

Personal Video Recorder

GPS

Global Positioning System

ECM

Entitlement Control Message

EMM

Entitlement Management Message

ROI

Region Of Interest

PIN

Personal identification number

DSM-CC

Digital Storage Media - Command and Control Protocol

RTP

Real-time Transport Protocol

UMID

Unique Material Identifier

MHEG

Multimedia and Hypermedia information coding Expert Group

XML

eXtensible Markup Language

Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
Definition statement

This place covers:

Subject matter comprising methods and components in the main broadcast server, headend, video-on-demand server, or a server associated with the headend/video-on-demand server, including services, management and operations performed on the bitstream for distribution to client devices or an intermediate component over a network. The server adapts the content received from the content provider to the distribution network and only provides a network interface. Addressing issues and the exchange of control signals with the clients or the network are placed by definition in the T-model.

The first layer of this subgroup pertains to the physical description of the server, e.g. its internal components or the sources of the content. The server may consist of a single physical entity or of a plurality of interconnected sub-servers. The second layer is directed to elementary specialized functions such as the storage and retrieval of the content, the processing of the elementary multimedia streams, the multiplexing thereof, the insertion of additional data, the processing of the data at the downstream and upstream network interfaces (e.g. channel coding, network adaptation, handling of client requests), and the monitoring of internal processes, e.g. server load, or of network interfaces, e.g. downstream bandwidth. The third layer describes the management of the content and of the system, such as client device or user management, scheduling issues, e.g. according to bandwidth or billing policies, creation of virtual channels, and management of services not directly linked to the distribution of multimedia content, e.g. billing, shopping, rights. The last layer is directed to data services directly accessible by the user, such as hosting of private data.

Raw multimedia data per se is placed in H04N 21/80. The subgroup is directed to documents related to server functions, such as transmitting data to the user; however, server characteristics initiated or performed on behalf of a user request are placed in H04N 21/60.

Examples of documents placed in the S-model:

(1) This subgroup is directed towards a server which could be the source of additional information related to the World Wide Web.

(2) This subgroup is directed towards alteration of the scene composition in regards to video objects (e.g. MPEG-2 or MPEG-4 objects).

(3) This subgroup is directed towards multiplexing of video and audio streams for transmission.

(4) This subgroup is directed towards the distribution of video data throughout a dwelling where the user is unaware of other users (e.g. a hotel, airplane or train). Systems that provide video distribution within a dwelling where the user is aware of other users (e.g. a home gateway) are classified elsewhere.

(5) This subgroup is directed to local storage built into (or next to) the server (H04N 21/218) and placement of the data onto the local storage device (H04N 21/231); this is typically used in a VOD environment. Systems concerned with the specific details of storage or recording of video data, where the claimed invention is directed to how the video is stored or recorded (e.g. placement of the recording heads within a local storage device on a server), are classified elsewhere.

(6) This subgroup is directed to documents related to the insertion of server-related data into a signal, such as time information inserted into EPG information.

References
Limiting references

This place does not cover:

Streaming audio/video via internet

H04N 21/6125

Generation of the timestamps used for synchronization purposes

H04N 21/8547

URLs sent in the video signal

H04N 21/8586

Server components or server architectures
Definition statement

This place covers:

Physical description of the multimedia server. As most of the components are always present (e.g. modulator, memory), a symbol should be allocated only if one of the components has a critical function in the invention. It should further be noted that most of the components already have an entry in other technical fields and that, for example, the circuitry of a modulator is not part of this model.

Specialised server platform, e.g. server located in an airplane, hotel, hospital {(arrangements specially adapted for local area broadcast systems H04H 20/61)}
Definition statement

This place covers:

Servers specially adapted to systems located in a confined environment.

References
Limiting references

This place does not cover:

Arrangements specially adapted for local area broadcast systems

H04H 20/61

{located in a single building, e.g. hotel, hospital or museum (arrangements specially adapted for plural spots in a confined site in broadcast systems H04H 20/63; adaptations for transmission by electric cable for domestic distribution in television systems H04N 7/106)}
Definition statement

This place covers:

The server is used to distribute the content in a very limited geographical area, such as a single building, and is localized in that same building. It can be, for example, a hotel, multiple dwelling units, a hospital, a museum, or a movie theater if serving different projection rooms.

References
Limiting references

This place does not cover:

Adaptations for transmission by electric cable for domestic distribution in television systems

H04N 7/106

Arrangements specially adapted for plural spots in a confined site in broadcast systems

H04H 20/63

{located in mass transportation means, e.g. aircraft, train or bus (flight-deck installations for entertainment or communications B64D 11/0015; arrangements specially adapted for transportation systems in broadcast systems H04H 20/62; moving wireless networks H04W 84/005)}
Definition statement

This place covers:

The server and clients are localized in a movable object, such as an aircraft, a train or a bus.

References
Limiting references

This place does not cover:

Flight-deck installations for entertainment or communications

B64D 11/0015

Arrangements specially adapted for transportation systems in broadcast systems

H04H 20/62

Moving wireless networks

H04W 84/005

Source of audio or video content {, e.g. local disk arrays (details of retrieval in video databases G06F 16/739)}
Definition statement

This place covers:

The source from which the multimedia server accesses the multimedia content.

References
Limiting references

This place does not cover:

Details of retrieval in video databases

G06F 16/739

{enabling multiple viewpoints, e.g. using a plurality of cameras}
Definition statement

This place covers:

  • The same scene shot by different cameras under different angles.
  • Panoramic video.
{comprising remotely distributed storage units, e.g. when movies are replicated over a plurality of video servers (distributed storage of data in networks, e.g. transport arrangements for network file system [NFS], storage area networks [SAN] or network attached storage [NAS], H04L 67/1097)}
Definition statement

This place covers:

The source is located remotely, e.g. in other video servers, when all available movies are distributed over a plurality of video servers of equal importance.

Example(s) of documents found in this subgroup: WO0158163

References
Limiting references

This place does not cover:

Distributed storage of data in networks, e.g. transport arrangements for network file system [NFS], storage area networks [SAN] or network attached storage [NAS]

H04L 67/1097

Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems involving a hierarchy between servers

H04N 21/222

{comprising local storage units}
Definition statement

This place covers:

The video source is built into the server or located next to it; this is typical for a VOD server.

{involving memory arrays, e.g. RAID disk arrays (RAID arrays per se G06F 3/0689; use of parity to protect data in RAID systems G06F 11/1008)}
Definition statement

This place covers:

Videos stored on disk arrays.

References
Limiting references

This place does not cover:

RAID arrays per se

G06F 3/0689

Use of parity to protect data in RAID systems

G06F 11/1008

{involving removable storage units, e.g. tertiary storage such as magnetic tapes or optical disks}
Definition statement

This place covers:

Videos retrieved from removable media, e.g. magnetic tapes or optical disks.

Cache memory {(caches in web servers or browsers G06F 16/9574; intermediate storage and caching in data networks H04L 67/568)}
Definition statement

This place covers:

Physical aspects of the cache.

Example(s) of documents found in this subgroup: EP1315091

References
Limiting references

This place does not cover:

Caches in web servers or browsers

G06F 16/9574

Intermediate storage and caching in data networks

H04L 67/568

Informative references

Attention is drawn to the following places, which may be of interest for search:

Caching operation on the server side

H04N 21/23106

Live feed
Definition statement

This place covers:

Live feeds from cameras or from satellite at a headend.

Secondary servers, e.g. proxy server, cable television Head-end {(provisioning of proxy services in data packet switching networks H04L 67/56)}
Definition statement

This place covers:

  • Local servers for serving mobile terminals.
  • The concept of secondary server is used to describe a hierarchy among several servers, as for example in distributed systems.
References
Limiting references

This place does not cover:

Provisioning of proxy services in data packet switching networks

H04L 67/56

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Secondary server

A server belonging to a hierarchy of servers, e.g. servers forming part of distributed systems or local servers for serving mobile terminals

{being a cable television head-end (CATV in broadcast systems H04H 20/78)}
Definition statement

This place covers:

Local server in a broadcast system.

References
Limiting references

This place does not cover:

CATV in broadcast systems

H04H 20/78

{being a public access point, e.g. for downloading to or uploading from clients (arrangements specially adapted to plural spots in a confined site in broadcast systems H04H 20/63)}
Definition statement

This place covers:

A public access point where content can be downloaded to or uploaded from clients.

References
Limiting references

This place does not cover:

Arrangements specially adapted to plural spots in a confined site in broadcast systems

H04H 20/63

Local VOD servers
Definition statement

This place covers:

Local VOD server to serve a small area.

{Server identification by a unique number or address, e.g. serial number (network arrangements, protocols or services for addressing or naming H04L 61/00)}
Definition statement

This place covers:

Identification number of the server. It can be used for authenticating the server.

References
Limiting references

This place does not cover:

Network arrangements, protocols or services in data packet switching networks for addressing or naming

H04L 61/00

Processing of content or additional data; Elementary server operations; Server middleware
Definition statement

This place covers:

Elementary specialized functions, which can be implemented in software or hardware. Their task is to control the corresponding hardware component and to provide a service to the upper layer, e.g. network synchronization using a master clock for downstream/upstream transmissions, or synchronization of transmitters.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Handling or recovery of errors occurring in the server

H04N 21/2404

Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
Definition statement

This place covers:

The organization of storage and the act of storing or writing data. Storage can be performed in disk arrays as found in VOD servers, as well as in internal databases; also covers the caching of movies or data and any other memory-related aspect.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Retrieving and reading data in the server

H04N 21/232

Server-side memory management

H04N 21/241

{using load balancing strategies, e.g. by placing or distributing content on different disks, different memories or different servers (storage management G06F 3/0604; allocation of resources considering the load in multiprogramming arrangements G06F 9/505; techniques for rebalancing the load in a distributed system G06F 9/5083; access to distributed or replicated servers, e.g. load balancing, in data networks H04L 67/1001)}
Definition statement

This place covers:

Methods describing the placement or distribution of content on different disks or different servers with the aim of providing a balanced load within the (distributed) system.

Example(s) of documents found in this subgroup: US 2004/0202444 A1
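
A minimal greedy placement sketch in Python, in which each new content item is assigned to the currently least-loaded disk or server; all names and sizes are illustrative assumptions:

    # Greedy load-balanced placement: each item goes to the least-loaded store.
    def place_items(items, stores):
        """items: list of (title, size); stores: dict store name -> current load."""
        placement = {}
        for title, size in items:
            target = min(stores, key=stores.get)   # pick the least-loaded disk/server
            placement[title] = target
            stores[target] += size                 # account for the newly placed item
        return placement

    print(place_items([("movie_a", 4), ("movie_b", 2), ("movie_c", 3)],
                      {"server_1": 0, "server_2": 1}))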

References
Limiting references

This place does not cover:

Storage management

G06F 3/0604

Allocation of resources considering the load in multiprogramming arrangements

G06F 9/505

Techniques for rebalancing the load in a distributed system

G06F 9/5083

Access to distributed or replicated servers, e.g. load balancing, in data networks

H04L 67/1001

Informative references

Attention is drawn to the following places, which may be of interest for search:

Data replication on different disks or servers

H04N 21/23116

{involving caching operations (prefetching while addressing of a memory level in which the access to the desired data or data block requires associative addressing means within memory systems or architectures G06F 12/0862; caching at an intermediate stage in a data network H04L 67/568)}
Definition statement

This place covers:

Caching operations, for example of movies in a local VOD server. The storage is temporary and must be distinguished from buffering as performed in the video encoder, which holds the multimedia data only for a brief period of time.

Example(s) of documents found in this subgroup: US 2002/0169926 A1
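
A minimal Python sketch of such a caching operation, here with least-recently-used eviction; the capacity, identifiers and fetch function are illustrative assumptions:

    from collections import OrderedDict

    class MovieCache:
        """Server-side cache holding a few movies for short-term storage."""
        def __init__(self, capacity=3):
            self.capacity = capacity
            self.store = OrderedDict()               # movie_id -> cached data

        def get(self, movie_id, fetch_from_storage):
            if movie_id in self.store:
                self.store.move_to_end(movie_id)     # mark as recently used
                return self.store[movie_id]
            data = fetch_from_storage(movie_id)      # cache miss: load from main storage
            self.store[movie_id] = data
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)       # evict the least recently used movie
            return data

    cache = MovieCache()
    print(cache.get("movie_42", lambda mid: f"<stream bytes of {mid}>"))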

References
Limiting references

This place does not cover:

Buffering on the encoder side

H04N 21/23406

Prefetching while addressing of a memory level in which the access to the desired data or data block requires associative addressing means within memory systems or architectures

G06F 12/0862

Caching at an intermediate stage in a data network

H04L 67/568

{by placing content in organized collections, e.g. EPG data repository (details of retrieval of video data and associated meta data in video databases G06F 16/739)}
Definition statement

This place covers:

Details of the generation and management of a local database; local data are in practice always stored in some kind of database, ranging from simple lists to complex structures.

References
Limiting references

This place does not cover:

Details of retrieval of video data and associated meta data in video databases

G06F 16/739

{involving housekeeping operations for stored content, e.g. prioritizing content for deletion because of storage space restrictions (storage management, e.g. defragmentation G06F 3/0604; unloading stored programs G06F 9/445; housekeeping operations in file systems, e.g. deletion policies G06F 16/10; buffering arrangements in a network node or in an end terminal in packet networks H04L 49/90)}
Definition statement

This place covers:

Algorithms describing which data are prioritized for deletion (e.g. the oldest or least used data) are classified here.
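
A minimal Python sketch of such a housekeeping policy, here deleting the least recently accessed items first until enough space is freed; field names and values are illustrative assumptions:

    # Select stored items for deletion, oldest access time first.
    def select_for_deletion(items, space_needed):
        """items: list of dicts with 'title', 'size' and 'last_access' (timestamp)."""
        freed, to_delete = 0, []
        for item in sorted(items, key=lambda i: i["last_access"]):   # oldest first
            if freed >= space_needed:
                break
            to_delete.append(item["title"])
            freed += item["size"]
        return to_delete

    catalog = [{"title": "a", "size": 4, "last_access": 100},
               {"title": "b", "size": 2, "last_access": 300},
               {"title": "c", "size": 5, "last_access": 200}]
    print(select_for_deletion(catalog, 6))   # deletes 'a', then 'c'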

References
Limiting references

This place does not cover:

Storage management, e.g. defragmentation

G06F 3/0604

Unloading stored programs

G06F 9/445

Housekeeping operations in file systems, e.g. deletion policies

G06F 16/10

Buffering arrangements in a network node or in an end terminal in packet networks

H04L 49/90

{involving data replication, e.g. over plural servers (synchronization of replicated data G06F 11/1658; error detection or correction by means of data replication G06F 11/2053; replication in distributed file systems G06F 16/10; replication in distributed file systems G06F 16/27; replication or mirroring of data in data networks H04L 67/1095)}
Definition statement

This place covers:

Content replicated over different servers or over different hard disks.

References
Limiting references

This place does not cover:

Synchronization of replicated data

G06F 11/1658

Error detection or correction by means of data replication

G06F 11/2053

Replication in distributed file systems

G06F 16/10

Replication in distributed file systems

G06F 16/27

Replication or mirroring of data in data networks

H04L 67/1095

Data placement on disk arrays {(data placement in general G06F 3/0604)}
Definition statement

This place covers:

Data block placement strategies in the disk array of video servers.

References
Limiting references

This place does not cover:

Data placement in general

G06F 3/0604

using interleaving
Definition statement

This place covers:

  • Successive file blocks stored on different disks.
  • A whole sector localized on one disk only.
using striping
Definition statement

This place covers:

A data sector distributed over several disks (RAID technology).
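
A minimal Python sketch contrasting the two placement schemes of this group and the preceding one: interleaving stores successive whole blocks on different disks, whereas striping splits each block across all disks of the array; block contents and disk counts are illustrative assumptions:

    def interleave(blocks, n_disks):
        """Round-robin placement: block i goes entirely to disk i mod n_disks."""
        disks = [[] for _ in range(n_disks)]
        for i, block in enumerate(blocks):
            disks[i % n_disks].append(block)
        return disks

    def stripe(block, n_disks):
        """Split one block into n_disks stripe units, one unit per disk."""
        unit = len(block) // n_disks
        return [block[i * unit:(i + 1) * unit] for i in range(n_disks)]

    print(interleave(["b0", "b1", "b2", "b3"], 2))   # [['b0', 'b2'], ['b1', 'b3']]
    print(stripe("AABBCCDD", 4))                     # ['AA', 'BB', 'CC', 'DD']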

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

RAID

Redundant array of independent disks. A storage technology that combines multiple disk drive components into a logical unit

Content retrieval operation {locally} within server, e.g. reading video streams from disk arrays {(storage management G06F 3/0604; details of querying and searching of video data from a database G06F 16/739)}
Definition statement

This place covers:

Operations linked to the retrieval of the multimedia stream from the disks, e.g. disk scheduling and file mapping.

References
Limiting references

This place does not cover:

Storage management

G06F 3/0604

Details of querying and searching of video data from a database

G06F 16/739

Informative references

Attention is drawn to the following places, which may be of interest for search:

Content storage operation

H04N 21/231

Processing of audio elementary streams {(monitoring, identification or recognition of audio in broadcast systems H04H 60/58)}
References
Limiting references

This place does not cover:

Monitoring, identification or recognition of audio in broadcast systems

H04H 60/58

Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of formatting and decoding of an encoded audio signal representation into a data stream for transmission or storage purposes

G10L 19/167

{involving reformatting operations of audio signals, e.g. by converting from one coding standard to another (details of audio signal transcoding G10L 19/173)}
Definition statement

This place covers:

Reformatting of an audio stream, e.g. by converting from one coding standard to another.

References
Limiting references

This place does not cover:

Details of audio signal transcoding

G10L 19/173

Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
Definition statement

This place covers:

  • Video stream management.
  • The control of the encoder, video scaling and transcoding aspects, synchronization, interactive control of playback, composition of MPEG-4 objects or embedding of graphics or text.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Video encoding or transcoding processes per se

H04N 19/00

Involving embedding information at multiplex stream level, e.g. embedding a watermark at packet level

H04N 21/23892

Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs

H04N 21/44

{involving management of server-side video buffer}
Definition statement

This place covers:

Buffer level control.

{for generating or manipulating the scene composition of objects, e.g. MPEG-4 objects}
Definition statement

This place covers:

Spatial composition of MPEG-4 objects at program generation, using a scene graph.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Scene rendering using a scene graph

H04N 21/44012

{involving operations for analysing video streams, e.g. detecting features or characteristics (television picture signal circuitry for scene change detection H04N 5/147; filtering for image enhancement G06T 5/00; methods or arrangements for recognising scenes G06V 20/00; arrangements characterised by components specially adapted for monitoring, identification or recognition of video in broadcast systems H04H 60/59)}
Definition statement

This place covers:

Detection of features (e.g. logo) in a video stream, extraction of characteristics directly from the video stream.

References
Limiting references

This place does not cover:

Television picture signal circuitry for scene change detection

H04N 5/147

Filtering for image enhancement

G06T 5/00

Methods or arrangements for recognising scenes

G06V 20/00

Arrangements characterised by components specially adapted for monitoring, identification or recognition of video in broadcast systems

H04H 60/59

Informative references

Attention is drawn to the following places, which may be of interest for search:

Image analysis per se

G06T 7/00

{involving splicing one content stream with another content stream, e.g. for inserting or substituting an advertisement}
Definition statement

This place covers:

Splicing of at least one video stream with another stream (video or not) at the server level. It can be used for inserting or substituting a piece of video such as a commercial.

involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements {(video transcoding H04N 19/40; media packet handling at the source H04L 65/762)}
Definition statement

This place covers:

The original A/V stream received from the content provider is reformatted. The output format is defined here.

Example(s) of documents found in this subgroup: US 2008/001791 A1

References
Limiting references

This place does not cover:

Video transcoding

H04N 19/40

Media packet handling at the source in data packet switching networks

H04L 65/762

Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of conversion of video standards at pixel level

H04N 7/01

{by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo (conversion of standards in analog television systems H04N 7/01)}
Definition statement

This place covers:

Transcoding between standards (e.g. MPEG-2 to MPEG-4) or between formats, e.g. from Quicktime to Realvideo.

References
Limiting references

This place does not cover:

Conversion of standards in analog television systems

H04N 7/01

{by decomposing into objects, e.g. MPEG-4 objects}
Definition statement

This place covers:

The components have been coded according to MPEG-4 and become objects.

{by decomposing into layers, e.g. base layer and one or more enhancement layers}
Definition statement

This place covers:

  • Content divided in layers, e.g. base layer and one or more enhancement layers.
  • Multiple Description Coding [MDC].
{by media transcoding, e.g. video is transformed into a slideshow of still pictures or audio is converted into text}
Definition statement

This place covers:

  • Transcoding between modalities, e.g. audio to text.
  • A slideshow of still pictures transformed into a video.
{the reformatting operation being performed only on part of the stream, e.g. a region of the image or a time segment}
Definition statement

This place covers:

The reformatting operation is performed on only part of the stream, the part being a spatial region of the image or a time segment.

{by altering signal-to-noise ratio parameters, e.g. requantization}
Definition statement

This place covers:

  • New quantization parameters are introduced, allowing the quality (signal-to-noise ratio) of each video frame to be changed.
  • Degradation of the signal by addition of noise.
{by altering the spatial resolution, e.g. for clients with a lower screen resolution}
Definition statement

This place covers:

The server provides a video with a spatial resolution commensurate with, e.g., the display capabilities of the client.

{for performing aspect ratio conversion}
Definition statement

This place covers:

Server reformats video to alter aspect ratio, e.g. between 4:3 and 16:9.

{for generating different versions}
Definition statement

This place covers:

Different versions of the same audio/video stream are created and stored for later immediate retrieval.

involving video stream encryption
Definition statement

This place covers:

  • Scrambling of the video stream, encryption of the content stream.
  • Scrambling of multimedia content in general.

Example(s) of documents found in this subgroup: US 2002/0085734 A1

References
Limiting references

This place does not cover:

Multiplex stream encryption in the server

H04N 21/23895

Arrangements using cryptography for the use of broadcast information or broadcast-related information

H04H 60/23

Informative references

Attention is drawn to the following places, which may be of interest for search:

Analogue secrecy systems

H04N 7/16

Arrangements for secret or secure communication

H04L 9/00

Arrangements for preventing the taking of data from a data transmission channel without authorisation

H04L 12/22

Security arrangements in wireless networks

H04W 12/00

{by pre-encrypting}
Definition statement

This place covers:

Encryption of content before storage in a (VOD) server, also known as off-line encryption.

{by partially encrypting, e.g. encrypting the ending portion of a movie}
Definition statement

This place covers:

Not all of the signal is scrambled, or different parts are encrypted differently, e.g. to reduce processor load or to enable a reduced-quality presentation.

Processing of additional data, e.g. scrambling of additional data or processing content descriptors
Definition statement

This place covers:

Insertion of software modules and additional data in the video stream. The specific nature of the additional data is not considered.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Calculation of the repetition rate and of the timing of insertion of additional data by the server-side scheduler

H04N 21/262

Processing of additional data on the client side

H04N 21/435

Arrangements for simultaneous broadcast of plural pieces of information

H04H 20/28

{involving encryption of additional data (arrangements using cryptography for the use of broadcast information or broadcast-related information H04H 60/23)}
References
Limiting references

This place does not cover:

Arrangements using cryptography for the use of broadcast information or broadcast-related information

H04H 60/23

{specifically adapted to content descriptors, e.g. coding, compressing or processing of metadata}
Definition statement

This place covers:

Coding/compression or more generally modification of additional data associated with the content.

{involving reformatting operations of additional data, e.g. HTML pages (optimising the visualization of content for information retrieval from the Internet G06F 16/9577; tracking of instant messages H04L 51/234; media packet handling at the source H04L 65/762)}
Definition statement

This place covers:

Additional information, such as an HTML page, is reformatted by the server. Also covers translation into a different language.

Example(s) of documents found in this subgroup: WO 02/071264 A2

References
Limiting references

This place does not cover:

Optimising the visualization of content for information retrieval from the Internet

G06F 16/9577

Tracking of instant messages

H04L 51/234

Media packet handling at the source in data packet switching networks

H04L 65/762

{by altering the spatial resolution}
Definition statement

This place covers:

Modified resolution of the additional information. It can be used, e.g. to reformat additional data for different destination client devices.

{for generating different versions, e.g. for different recipient devices}
Definition statement

This place covers:

The server generates at least one other version of the original additional data, which is available together with the original version.

Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream {(multiplexing of data packets for data networks, e.g. RTP/UDP H04L 65/00)}
Definition statement

This place covers:

Transport stream generation: takes as input video or audio streams, or an already multiplexed A/V stream (remultiplexing), and outputs a single transport stream.

References
Limiting references

This place does not cover:

Multiplexing of data packets for data networks

H04L 65/00

{Remultiplexing multiplex streams, e.g. involving modifying time stamps or remapping the packet identifiers}
Definition statement

This place covers:

Modification of bitstream parameters, e.g. restamping, transmultiplexing, remapping of PIDs.
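
A minimal Python sketch of one such operation, the remapping of the 13-bit packet identifier [PID] carried in the header of each 188-byte MPEG-2 transport stream packet; the mapping values are illustrative assumptions:

    def remap_pid(packet, pid_map):
        """packet: one 188-byte TS packet (bytes); returns a copy with its PID remapped."""
        pid = ((packet[1] & 0x1F) << 8) | packet[2]           # extract the 13-bit PID
        new_pid = pid_map.get(pid, pid)                       # leave unmapped PIDs unchanged
        header = bytes([packet[0],
                        (packet[1] & 0xE0) | (new_pid >> 8),  # keep the three flag bits
                        new_pid & 0xFF])
        return header + packet[3:]

    ts_packet = b"\x47\x01\x00" + b"\x00" * 185               # packet carrying PID 0x100
    remapped = remap_pid(ts_packet, {0x100: 0x200})
    print(hex(((remapped[1] & 0x1F) << 8) | remapped[2]))     # -> 0x200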

{Insertion of stuffing data into a multiplex stream, e.g. to obtain a constant bitrate (synchronisation arrangements in time-division multiplex systems using bit stuffing for systems with different or fluctuating information rates H04J 3/073)}
Definition statement

This place covers:

Insertion of stuffing bits/bytes/packets into the packetised stream, e.g. to obtain a constant bitrate.
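
A minimal Python sketch of the stuffing principle: each output interval is padded with MPEG-2 transport stream null packets (PID 0x1FFF) until it contains a fixed number of 188-byte packets, yielding a constant bitrate; the interval size is an illustrative assumption:

    NULL_PACKET = b"\x47\x1f\xff" + b"\xff" * 185    # sync byte + PID 0x1FFF + stuffing bytes

    def pad_interval(packets, packets_per_interval):
        """packets: list of 188-byte TS packets produced during one output interval."""
        missing = packets_per_interval - len(packets)
        return packets + [NULL_PACKET] * max(0, missing)

    burst = [b"\x47" + b"\x00" * 187] * 3            # three real packets in this interval
    out = pad_interval(burst, packets_per_interval=5)
    print(len(out), "packets sent, of which", out.count(NULL_PACKET), "are stuffing")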

References
Limiting references

This place does not cover:

Synchronisation arrangements in time-division multiplex systems using bit stuffing for systems with different or fluctuating information rates

H04J 3/07

{Multiplexing of additional data and video streams (arrangements for simultaneous broadcast of plural pieces of information H04H 20/28)}
Definition statement

This place covers:

Multiplexing into an MPEG stream according to the DVB standard or, generally speaking, insertion of additional data into the stream of a digital TV system.

References
Limiting references

This place does not cover:

Arrangements for simultaneous broadcast of plural pieces of information

H04H 20/28

{by inserting additional data into a data carousel, e.g. inserting software modules into a DVB carousel (arrangements for broadcast or for distribution of identical information repeatedly in broadcast distribution systems H04H 20/16)}
Definition statement

This place covers:

Insertion in a DVB carousel.

References
Limiting references

This place does not cover:

Arrangements for broadcast or for distribution of identical information repeatedly in broadcast distribution systems

H04H 20/16

Generation or processing of Service Information [SI]
Definition statement

This place covers:

Generation of MPEG SI and PSI tables.

{Statistical multiplexing, e.g. by controlling the encoder to alter its bitrate to optimize the bandwidth utilization}
Definition statement

This place covers:

The typical structure of a stat mux is a multiplexer which sends command signals back to the video coder(s) to make them change parameters (e.g. bitrate) so as to optimise the global use of the bandwidth.
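
A minimal Python sketch of this feedback principle, in which the multiplexer divides a fixed total bitrate among the encoders in proportion to the scene complexity each one reports; channel names and figures are illustrative assumptions:

    def allocate_bitrates(total_bitrate, complexities):
        """complexities: dict encoder id -> reported scene complexity (arbitrary units)."""
        total = sum(complexities.values())
        return {enc: total_bitrate * c / total for enc, c in complexities.items()}

    # Three encoders sharing a 15 Mbit/s multiplex; the sports channel is hardest to code.
    commands = allocate_bitrates(15_000_000,
                                 {"news": 1.0, "sports": 3.0, "cartoon": 1.0})
    for encoder, bitrate in commands.items():
        print(f"set {encoder} target bitrate to {bitrate / 1e6:.1f} Mbit/s")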

Multiplexing of audio and video streams
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Generation of timestamps for synchronization purposes

H04N 21/8547

Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams {(hybrid fiber coaxial [HFC] networks for downstream channel allocation for video distribution H04L 12/2801; flow control in packet networks H04L 47/10; real-time communication protocols in data switching networks H04L 65/00; scheduling or organising the servicing of application requests H04L 67/60)}
Definition statement

This place covers:

Processing the transport stream after its assembly and sending it over the network.

References
Limiting references

This place does not cover:

Hybrid Fiber Coaxial [HFC] networks for downstream channel allocation for video distribution

H04L 12/2801

Flow control in packet networks

H04L 47/10

Real-time communication protocols in data switching networks

H04L 65/00

Scheduling or organising the servicing of application requests in data packet switching networks

H04L 67/60

{Controlling the feeding rate to the network, e.g. by controlling the video pump}
Definition statement

This place covers:

The video pump is responsible for feeding the program content to the network at the correct data rate, for example after having received a control signal from the network.
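
A minimal Python sketch of pacing the output at a target data rate by sleeping between packets; the rate, packet size and the send() placeholder are illustrative assumptions:

    import time

    def send(packet):                                    # placeholder network interface
        print(f"{time.monotonic():.3f}s: sent {len(packet)} bytes")

    def pump(packets, rate_bits_per_s, packet_size=188):
        interval = packet_size * 8 / rate_bits_per_s     # seconds between packets
        for packet in packets:
            send(packet)                                 # hand the packet to the network
            time.sleep(interval)                         # pace output to the target rate

    pump([b"\x47" + b"\x00" * 187] * 3, rate_bits_per_s=150_400)   # ~100 packets per second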

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Video streams retrieval

H04N 21/232

Adapting the multiplex stream to a specific network, e.g. an Internet Protocol [IP] network {(transmission of MPEG streams over ATM H04L 12/5601)}
Definition statement

This place covers:

Bitstream adapted to a specific network. The type of network or protocol used is classified elsewhere.

References
Limiting references

This place does not cover:

Transmission of MPEG streams over ATM

H04L 12/5601

Informative references

Attention is drawn to the following places, which may be of interest for search:

Adapting the video stream to a specific local network, e.g. a Bluetooth® network

H04N 21/4363

Channel coding {or modulation} of digital bit-stream, e.g. QPSK modulation (arrangements for detecting or preventing errors in the information received by adapting the channel coding H04L 1/0009; analogue front ends or means for connecting modulators, demodulators or transceivers to a transmission line H04L 27/0002)
Definition statement

This place covers:

Protection of the digital bitstream (e.g. RS coding) and modulation.

References
Limiting references

This place does not cover:

Arrangements for detecting or preventing errors in the information received by adapting the channel coding

H04L 1/0009

Analog front ends or means for connecting modulators, demodulators or transceivers to a transmission line

H04L 27/0002

Channel allocation (H04N 21/266 takes precedence); Bandwidth allocation (H04N 21/24 takes precedence {; allocation of channels according to the instantaneous demands of the users in time-division multiplex systems H04J 3/1682; admission control, resource allocation in open networks H04L 12/5692; arrangements for maintenance or administration in data switching networks involving bandwidth and capacity management H04L 41/0896; negotiating bandwidth in wireless networks H04W 28/16})
Definition statement

This place covers:

Channel and bandwidth allocation.

Example(s) of documents found in this subgroup: WO 03/088667 A1

References
Limiting references

This place does not cover:

Allocation of channels according to the instantaneous demands of the users in time-division multiplex systems

H04J 3/1682

Admission control, resource allocation in open networks

H04L 12/5692

Arrangements for maintenance or administration in data switching networks involving bandwidth and capacity management

H04L 41/0896

Negotiating bandwidth in wireless networks

H04W 28/16

Stream processing in response to a playback request from an end-user, e.g. for trick-play
Definition statement

This place covers:

Management of the video stream after receiving an upstream playback control signal from the client, for example in a VOD system to pause or fast-forward the video stream.

Multiplex stream processing, e.g. multiplex stream encrypting
Definition statement

This place covers:

Processing of the transport stream as received from the network and before being adapted to the delivery medium.

{involving embedding information at multiplex stream level, e.g. embedding a watermark at packet level}
Definition statement

This place covers:

  • Embedding of data in a piece of content, for example picture, text in a video.
  • The operations performed by a content provider at a workstation to create an interactive multimedia presentation.
{involving multiplex stream encryption}
Definition statement

This place covers:

Only the scrambling/encrypting of the transport stream is covered here. The scrambling/encrypting of the video elementary stream is covered elsewhere.

Interfacing the upstream path of the transmission network, e.g. prioritizing client {content} requests (hybrid fiber coaxial [HFC] networks for upstream channel allocation for video distribution H04L 12/2801; flow control in data networks H04L 47/10; real-time communication protocols in data switching networks H04L 65/00; scheduling or organising the servicing of application requests H04L 67/60)
Definition statement

This place covers:

This interface manages the uplink signals coming from all the clients and is used for example to handle requests (e.g. requests for a particular multimedia service).

References
Limiting references

This place does not cover:

Hybrid Fiber Coaxial [HFC] networks for upstream channel allocation for video distribution

H04L 12/2801

Flow control in data networks

H04L 47/10

Real-time communication protocols in data switching networks

H04L 65/00

Scheduling or organising the servicing of application requests in data packet switching networks

H04L 67/60

{involving handling client requests (scheduling or organising the servicing of application requests H04L 67/60)}
References
Limiting references

This place does not cover:

Scheduling or organising the servicing of application requests in data packet switching networks

H04L 67/60

{characterized by admission policies (admission control, resource allocation in open networks H04L 12/5692; arrangements for network security using user profiles for access control H04L 63/102; access security in wireless networks H04W 12/08)}
Definition statement

This place covers:

Admission policies of clients in video servers.

References
Limiting references

This place does not cover:

Admission control, resource allocation in open networks

H04L 12/5692

Arrangements for network security using user profiles for access control

H04L 63/102

Access security in wireless networks

H04W 12/08

Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests {(monitoring of server performance or load G06F 11/34; arrangements for observation, testing or troubleshooting for broadcast or for distribution combined with broadcast H04H 20/12)}
Definition statement

This place covers:

Monitoring is an internal process which permanently checks user requests, the bandwidth available at the different network interfaces, or any internal process. It can generate reports of system usage.

References
Limiting references

This place does not cover:

Monitoring of server performance or load

G06F 11/34

Monitoring or testing of transmitters in general

H04B 17/10

Arrangements for observation, testing or troubleshooting for broadcast or for distribution combined with broadcast

H04H 20/12

{Monitoring of the client buffer}
Definition statement

This place covers:

The server monitors the client buffer.

{Monitoring of the downstream path of the transmission network, e.g. bandwidth available (traffic monitoring in data switching networks H04L 43/00; monitoring data switching networks utilization H04L 43/0876)}
Definition statement

This place covers:

Monitoring of the available bandwidth or bit rate.

References
Limiting references

This place does not cover:

Traffic monitoring in data switching networks

H04L 43/00

Monitoring data switching networks utilization

H04L 43/0876

{Monitoring of server processing errors or hardware failure (error or fault detection G06F 11/07; monitoring in general G06F 11/30)}
Definition statement

This place covers:

Detection of an error during content distribution, content loading or multiplex management, or detection of a hardware failure.

References
Limiting references

This place does not cover:

Error or fault detection

G06F 11/07

Monitoring in general

G06F 11/30

{Monitoring of the internal components or processes of the server, e.g. server load (allocation of resources in multiprogramming arrangements G06F 9/50; performance measurement of computer activity G06F 11/34)}
Definition statement

This place covers:

The load or processing capabilities of the server are monitored.

References
Limiting references

This place does not cover:

Allocation of resources in multiprogramming arrangements

G06F 9/50

Performance measurement of computer activity

G06F 11/34

{Monitoring of transmitted content, e.g. distribution time, number of downloads (arrangements for monitoring programmes for broadcast or for distribution combined with broadcast H04H 20/14)}
Definition statement

This place covers:

Monitoring of aired content for logging and verification purposes. The monitoring data can be sent to a rights server or to an advertiser for billing. Includes the number of times content has been downloaded (but not requested, which is classified elsewhere).

References
Limiting references

This place does not cover:

Arrangements for monitoring programmes for broadcast or for distribution combined with broadcast

H04H 20/14

{Monitoring of the upstream path of the transmission network, e.g. client requests (monitoring data switching networks utilization H04L 43/0876; scheduling or organising the servicing of application requests H04L 67/60)}
Definition statement

This place covers:

  • Requests from clients received at the upstream interface are monitored.
  • Includes log files of client requests.
References
Limiting references

This place does not cover:

Monitoring data switching networks utilization

H04L 43/0876

Scheduling or organising the servicing of application requests

H04L 67/60

Operating system [OS] processes, e.g. server setup
Definition statement

This place covers:

Basic functions provided by the operating system like memory management, event handling, multitasking, multithreading, setup.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

OS processes, e.g. booting a STB, implementing a Java virtual machine in a STB or power management in a STB

H04N 21/443

Program loading or initiating in general

G06F 9/445

Multiprogramming arrangements

G06F 9/46

Synchronization processes, e.g. processing of PCR [Program Clock References] {(arrangements for synchronising broadcast or distribution via plural systems in broadcast distribution systems H04H 20/18)}
Definition statement

This place covers:

Synchronization issues.

References
Limiting references

This place does not cover:

Arrangements for synchronising broadcast or distribution via plural systems in broadcast distribution systems

H04H 20/18

Informative references

Attention is drawn to the following places, which may be of interest for search:

Synchronising circuits with arrangements for extending range of synchronisation at the transmitter end

H04N 5/067

Synchronisation arrangements in time-division multiplex systems

H04J 3/06

Arrangements for synchronising receiver with transmitter

H04L 7/00

Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies {(maintenance or administration in data networks H04L 41/00)}
Definition statement

This place covers:

Server-side system management.

References
Limiting references

This place does not cover:

Maintenance or administration in data networks

H04L 41/00

{Learning process for intelligent management, e.g. learning user preferences for recommending movies (details of learning user preferences for the retrieval of video data in a video database G06F 16/739; computer systems using learning methods G06N 3/08)}
Definition statement

This place covers:

Server-side agents are similar to the agents implemented on the client and perform similar operations.

References
Limiting references

This place does not cover:

Details of learning user preferences for the retrieval of video data in a video database

G06F 16/739

Computer systems using learning methods

G06N 3/08

Informative references

Attention is drawn to the following places, which may be of interest for search:

Learning process for intelligent management

H04N 21/466

{Processing of multiple end-users' preferences to derive collaborative data}
Definition statement

This place covers:

Preference data are processed to determine similarities between users. Users can be clustered into a limited number of groups of viewers. The data are used to enrich the profile of one user by adding data from similar users.
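
A minimal Python sketch of deriving such collaborative data, here by measuring the overlap (Jaccard similarity) between the genre preferences of different users; user names and genres are illustrative assumptions:

    def jaccard(a, b):
        """Similarity between two sets of preferred genres."""
        return len(a & b) / len(a | b) if a | b else 0.0

    def most_similar(target, profiles):
        """profiles: dict user -> set of liked genres; returns the closest other user."""
        others = {u: p for u, p in profiles.items() if u != target}
        return max(others, key=lambda u: jaccard(profiles[target], others[u]))

    profiles = {"alice": {"sci-fi", "news"},
                "bob":   {"sci-fi", "news", "sports"},
                "carol": {"soap", "cooking"}}
    print(most_similar("alice", profiles))   # -> 'bob'; bob's data can enrich alice's profile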

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Deriving a common profile for several users on the same client, e.g. family profile

H04N 21/4661

Management at additional data server, e.g. shopping server, rights management server {(arrangements for maintenance or administration in data networks H04L 41/00; network services using third party service providers H04L 67/53)}
Definition statement

This place covers:

  • Non-video distribution application.
  • A whole range of services, which do not deal directly with the distribution of multimedia content. They play a crucial part of the associated business model but because of their non-technical nature, they are separated from the other management functions. They can also be provided by a 3rd party.
  • Support/help center, the HLR of a mobile phone network for collecting the position of a mobile client.
References
Limiting references

This place does not cover:

Arrangements for maintenance or administration in data networks

H04L 41/00

Network services using third party service providers

H04L 67/53

{Rights Management (protecting software against unauthorised usage in a vending or licensing environment G06F 21/10; security in data switching network management H04L 41/28; security management or policies for network security H04L 63/20; access security in wireless networks H04W 12/08)}
Definition statement

This place covers:

External server specially adapted to perform rights management operations.

References
Limiting references

This place does not cover:

Protecting software against unauthorised usage in a vending or licensing environment

G06F 21/10

Security in data switching network management

H04L 41/28

Security management or policies for network security

H04L 63/20

Access security in wireless networks

H04W 12/08

Informative references

Attention is drawn to the following places, which may be of interest for search:

Client-side monitoring of content usage

H04N 21/44204

Definition of usage data

H04N 21/8355

{for selling goods, e.g. TV shopping (payment schemes, payment architectures or payment protocols for electronic shopping systems G06Q 20/12)}
Definition statement

This place covers:

Shopping and product management aspects. The shopping application is classified elsewhere.

References
Limiting references

This place does not cover:

Payment schemes, payment architectures or payment protocols for electronic shopping systems

G06Q 20/12

Billing {, e.g. for subscription services (payment schemes, architectures or protocols G06Q 20/00; e-commerce G06Q 30/00; arrangements for billing for the use of broadcast information or broadcast-related information H04H 60/21)}
Definition statement

This place covers:

Billing aspects.

References
Limiting references

This place does not cover:

Payment architectures, schemes or protocols

G06Q 20/00

Commerce, e.g. shopping or e-commerce

G06Q 30/00

Arrangements for billing for the use of broadcast information or broadcast-related information

H04H 60/21

Informative references

Attention is drawn to the following places, which may be of interest for search:

Billing systems or methods specially adapted for commercial purposes

G06Q 30/04

Charging arrangements in data networks

H04L 12/14

{involving characteristics of content or additional data, e.g. video resolution or the amount of advertising}
Definition statement

This place covers:

The price depends on the nature of the program offered. It can also be inversely proportional to the amount of commercials inserted.

Third Party Billing, e.g. billing of advertiser
Definition statement

This place covers:

Billing aspects not pertaining to the end-user or subscriber but to a third party such as an advertiser. Billing can be performed according to monitored viewer selections.

Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data {(arrangements for services using the result on the distributing side of broadcast systems H04H 60/66; profiles in network data switching protocols H04L 67/30)}
Definition statement

This place covers:

Customer management: the server maintains databases for storing data about the clients it is connected to and about their users.

References
Limiting references

This place does not cover:

Arrangements for services using the result on the distributing side of broadcast systems

H04H 60/66

Profiles in network data switching protocols

H04L 67/30

{Management of client data (terminal profiles in network data switching protocols H04L 67/303)}
Definition statement

This place covers:

The management system stores data pertaining to the client device, regardless of its user.

References
Limiting references

This place does not cover:

Terminal profiles in network data switching protocols

H04L 67/303

{involving client authentication (restricting access to computer systems by authenticating users using a predetermined code G06F 21/33; cryptographic authentication protocols H04L 9/32; networks authentication protocols H04L 63/08; authentication in wireless network security H04W 12/06)}
Definition statement

This place covers:

The server authenticates the client device.

References
Limiting references

This place does not cover:

Restricting access to computer systems by authenticating users using a predetermined code

G06F 21/33

Cryptographic authentication protocols

H04L 9/32

Network authentication protocols

H04L 63/08

Authentication in wireless network security

H04W 12/06

{involving client display capabilities, e.g. screen resolution of a mobile phone (optimising the visualisation of content during browsing in the Internet G06F 16/9577; processing of terminal status or physical abilities in wireless networks H04W 8/22; authentication in wireless network security H04W 12/06)}
Definition statement

This place covers:

Clients may be diverse by nature and have different display capabilities, e.g. TV, PC, mobile phone or PDA.

References
Limiting references

This place does not cover:

Optimising the visualisation of content during browsing in the Internet

G06F 16/9577

Processing of terminal status or physical abilities in wireless networks

H04W 8/22

Authentication in wireless network security

H04W 12/06

Informative references

Attention is drawn to the following places, which may be of interest for search:

Reformatting of the video stream by the server, e.g. based on client parameters

H04N 21/2343

{involving client hardware characteristics, e.g. manufacturer, processing or storage capabilities (allocation of resources considering hardware capabilities in multiprogramming arrangements G06F 9/5044; allocation of resources considering software capabilities in multiprogramming arrangements G06F 9/5055)}
Definition statement

This place covers:

A hardware profile contains a client ID, the STB manufacturer and model, and general processing and memory/storage capabilities, but not display capabilities.

References
Limiting references

This place does not cover:

Allocation of resources considering hardware capabilities in multiprogramming arrangements

G06F 9/5044

Allocation of resources considering software capabilities in multiprogramming arrangements

G06F 9/5055

{involving the geographical location of the client (retrieval from the Internet by querying based on geographical locations G06F 16/9537; arrangements for identifying locations of receiving stations in broadcast systems H04H 60/51; location of the user terminal in data switching networks H04L 67/52; services making use of the location of users or terminals in wireless networks H04W 4/02; locating users or terminals in wireless networks H04W 64/00)}
Definition statement

This place covers:

The server determines or is aware of the location of the client device. The determination can be performed by retrieving data from a HLR in a mobile phone network or by triangulation methods.

References
Limiting references

This place does not cover:

Retrieval from the Internet by querying based on geographical locations

G06F 16/9537

Arrangements for identifying locations of receiving stations in broadcast systems

H04H 60/51

Location of the user terminal in data switching networks

H04L 67/52

Services making use of the location of users or terminals in wireless networks

H04W 4/02

Locating users or terminals in wireless networks

H04W 64/00

Special rules of classification

This group must be distinguished from user demographical data, which are classified elsewhere. This group is typically used for targeting location-dependent programs or additional information.

{Generation of a revocation list, e.g. of client devices involved in piracy acts}
Definition statement

This place covers:

The server keeps a list of client devices which have been reported to have been involved in piracy acts, such as falsifying the decryption card.

{involving client software characteristics, e.g. OS identifier}
Definition statement

This place covers:

The software profile contains a record of the type of software installed on the client, including version number for automatic upgrades.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of operating systems in clients

H04N 21/443

Executable data, e.g. software

H04N 21/8166

{Management of end-user data (customer care in data networks H04L 41/5077)}
Definition statement

This place covers:

The management system stores data related to its users regardless of the client device they use.

References
Limiting references

This place does not cover:

Customer care in data networks

H04L 41/5077

{involving end-user authentication (restricting access to computer systems by authenticating users using a predetermined code G06F 21/33; arrangements for secret or secure communication including means for verifying the identity or authority of a user of the system H04L 9/32; networks authentication protocols H04L 63/08; authentication in wireless network security H04W 12/06)}
Definition statement

This place covers:

Storage of physical characteristics of the user (e.g. fingerprint). The server authenticates the user of the client device.

References
Limiting references

This place does not cover:

Restricting access to computer systems by authenticating users using a predetermined code

G06F 21/33

Arrangements for secret or secure communication including means for verifying the identity or authority of a user of the system

H04L 9/32

Network authentication protocols

H04L 63/08

Authentication in wireless network security

H04W 12/06

{being end-user demographical data, e.g. age, family status or address (arrangements for identifying locations of users in broadcast systems H04H 60/52)}
Definition statement

This place covers:

When the user registers for the first time, the user provides demographical data such as gender, age, family status, profession, address and ZIP code. Covers general interests but not viewing interests.

References
Limiting references

This place does not cover:

Arrangements for identifying locations of users in broadcast systems

H04H 60/52

{being end-user preferences (retrieval of video data in a video database based on user preferences G06F 16/739; arrangements for recognizing users' preferences H04H 60/46; user profiles in network data switching protocols H04L 67/306; processing of user preferences or user profiles in wireless networks H04W 8/18)}
Definition statement

This place covers:

Preferences may be derived dynamically from the viewing history of the user, or collected at user registration when the user indicates general interests.

References
Limiting references

This place does not cover:

Retrieval of video data in a video database based on user preferences

G06F 16/735

Arrangements for recognizing users' preferences

H04H 60/46

User profiles in network data switching protocols

H04L 67/306

Processing of user preferences or user profiles in wireless networks

H04W 8/18

Informative references

Attention is drawn to the following places, which may be of interest for search:

Client-side monitoring of end-user

H04N 21/44213

Uploading data stored on the client to server

H04N 21/658

Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists {(scheduling strategies for dispatcher in multiprogramming arrangements G06F 9/4881; arrangements for scheduling broadcast services or broadcast-related services H04H 60/06; flow control in packet networks H04L 47/10; establishing a time schedule or organising the servicing of application requests H04L 67/62)}
Definition statement

This place covers:

The function of the scheduler is to plan the distribution of the multimedia content over time. It must guarantee that the client can access the content when it is supposed to. The scheduler considers a number of constraints, such as the different bandwidths available by day or by night, higher priorities if a user has paid a higher fee, or the best timing for inserting a commercial (prime time). It also has to perform allocation tasks, for example assigning a time and a channel to a TV program.
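
A minimal Python sketch of scheduling under such constraints, in which pending items are ordered by priority (highest fee first) and assigned to the first distribution slot with sufficient free bandwidth; all names and figures are illustrative assumptions:

    def schedule(items, slots):
        """items: list of dicts with 'name', 'bandwidth', 'fee';
        slots: dict slot name -> free bandwidth in that slot."""
        plan = {}
        for item in sorted(items, key=lambda i: -i["fee"]):        # premium items first
            for slot, free in slots.items():
                if free >= item["bandwidth"]:
                    plan[item["name"]] = slot
                    slots[slot] -= item["bandwidth"]
                    break
        return plan

    items = [{"name": "movie", "bandwidth": 6, "fee": 10},
             {"name": "advert", "bandwidth": 2, "fee": 2},
             {"name": "premium_sport", "bandwidth": 8, "fee": 20}]
    print(schedule(items, {"evening": 10, "night": 10}))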

References
Limiting references

This place does not cover:

Scheduling strategies for dispatcher in multiprogramming arrangements

G06F 9/4881

Arrangements for scheduling broadcast services or broadcast-related services

H04H 60/06

Flow control in packet networks

H04L 47/10

Establishing a time schedule or organising the servicing of application requests in data packet switching networks

H04L 67/62

{the scheduling operation being performed under constraints}
Definition statement

This place covers:

The scheduling algorithm performs optimization operations under constraints received as input data.

{involving the channel capacity, e.g. network bandwidth (admission control, resource allocation in open networks H04L 12/5692; flow control in packet networks H04L 47/10; establishing a schedule or organising the servicing of application requests taking into account QoS H04L 67/61)}
Definition statement

This place covers:

The scheduler prioritizes the items to be transmitted according to the available network bandwidth.

References
Limiting references

This place does not cover:

Admission control, resource allocation in open networks

H04L 12/5692

Flow control in packet networks

H04L 47/10

Establishing a schedule or organising the servicing of application requests taking into account QoS in data packet switching networks

H04L 67/61

{involving billing parameters, e.g. priority for subscribers of premium services}
Definition statement

This place covers:

The scheduler defines priorities for the different items to be sent, for example according to the billing policy (the user who has been charged most will be served first).

{involving content or additional data duration or size, e.g. length of a movie, size of an executable file}
Definition statement

This place covers:

Duration of a movie or TV program.

{involving the time of distribution, e.g. the best time of the day for inserting an advertisement or airing a children program}
Definition statement

This place covers:

Pertains to the time of the day or of the week, for example the best time of the day for inserting a commercial or airing a program suitable for children.

{for delaying content or additional data distribution, e.g. because of an extended sport event}
Definition statement

This place covers:

A TV program is delayed because of, e.g., an extended sport event.

{for generating a list of items to be played back in a given order, e.g. playlist, or scheduling item distribution according to such list (retrieval of multimedia data based on playlists G06F 16/40)}
Definition statement

This place covers:

Generation of a playlist and scheduling content items according to a playlist.

References
Limiting references

This place does not cover:

Retrieval of multimedia data based on playlists

G06F 16/40

{for determining content or additional data repetition rate, e.g. of a file in a DVB carousel according to its importance (arrangements for broadcast or for distribution of identical information repeatedly in broadcast distribution systems H04H 20/16)}
Definition statement

This place covers:

Algorithms considering at which frequency a piece of data should be repeated in the carousel, for example according to its importance. Also pertains to data which is repeated at a constant frequency.
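
A minimal Python sketch of such an algorithm, assigning each file a repetition period inversely proportional to its importance weight; file names, weights and the cycle time are illustrative assumptions:

    def repetition_periods(files, cycle_time):
        """files: dict name -> importance weight; returns name -> repetition period in seconds."""
        total = sum(files.values())
        # A file with weight w receives the share w/total of the carousel bandwidth,
        # so its repetition period is cycle_time * total / w.
        return {name: cycle_time * total / weight for name, weight in files.items()}

    print(repetition_periods({"boot_image": 4, "epg_update": 2, "logo": 1}, cycle_time=10))
    # -> boot_image every 17.5 s, epg_update every 35 s, logo every 70 s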

References
Limiting references

This place does not cover:

Arrangements for broadcast or for distribution of identical information repeatedly in broadcast distribution systems

H04H 20/16

{for distributing content or additional data in a staggered manner, e.g. repeating movies on different channels in a time-staggered manner in a near video on demand system}
Definition statement

This place covers:

Scheduling of NVOD services. Movies are repeated on different channels in a time-staggered manner.
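
A minimal Python sketch of the time-staggered start times of such an NVOD schedule; the movie length and channel count are illustrative assumptions:

    def nvod_start_times(movie_length_min, n_channels):
        """Offsets (in minutes) at which the same movie starts on each channel."""
        stagger = movie_length_min / n_channels      # offset between consecutive channels
        return [round(ch * stagger, 1) for ch in range(n_channels)]

    starts = nvod_start_times(movie_length_min=120, n_channels=8)
    print(starts)                                    # a new showing starts every 15 minutes
    print("maximum waiting time:", 120 / 8, "minutes")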

{for associating distribution time parameters to content, e.g. to generate electronic program guide data}
Definition statement

This place covers:

Metadata, such as program descriptors, is received from the content provider, which itself is not aware of the transmission schedule. The creation of the EPG data, consisting of metadata and time information, is therefore performed by the scheduler. The EPG user interface for program selection by the user is classified elsewhere.
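
A minimal Python sketch of this association step, joining provider metadata with the scheduler's own transmission plan to form EPG entries; field names are illustrative assumptions:

    def build_epg(metadata, schedule):
        """metadata: dict programme id -> descriptor (no schedule information);
        schedule: list of (programme id, channel, start time) decided by the scheduler."""
        return [{"channel": channel, "start": start, **metadata[pid]}
                for pid, channel, start in schedule]

    metadata = {"p1": {"title": "Evening News", "genre": "news"},
                "p2": {"title": "Space Movie", "genre": "sci-fi"}}
    schedule = [("p1", 1, "19:00"), ("p2", 1, "20:00")]
    for entry in build_epg(metadata, schedule):
        print(entry)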

{for providing content or additional data updates, e.g. updating software modules, stored at the client (deployment, distribution, installation, update of software G06F 8/65; error detection or correction during software upgrading G06F 11/1433; arrangements for updating broadcast information or broadcast-related information H04H 60/25)}
Definition statement

This place covers:

The scheduler decides to update data or software resident in the client for example on a regular basis or according to special events.

References
Limiting references

This place does not cover:

Deployment, distribution, installation, update of software

G06F 8/65

Error detection or correction during software upgrading

G06F 11/1433

Arrangements for updating broadcast information or broadcast-related information

H04H 60/25

Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
Definition statement

This place covers:

Gathering multimedia content from different sources, analyzing it and creating appropriate channels for the clients. This function receives input from the scheduler.

{for automatically generating descriptors from content, e.g. when it is not made available by its provider, using content analysis techniques}
Definition statement

This place covers:

Documents describing the automatic generation and processing of such descriptors, for example in case-based systems where an incoming piece of content is classified together with other similar ones.

{for generating or managing entitlement messages, e.g. Entitlement Control Message [ECM] or Entitlement Management Message [EMM] (arrangements for conditional access to broadcast information or to broadcast-related services H04H 60/14)}
Definition statement

This place covers:

Generation and management of entitlement messages in a conditional access system. Pertains to ECM and EMM only.

References
Limiting references

This place does not cover:

Arrangements for conditional access to broadcast information or to broadcast-related services

H04H 60/14

{using retrofitting techniques, e.g. by re-encrypting the control words used for pre-encryption}
Definition statement

This place covers:

Trans-encryption of the ECMs resulting from pre-encryption (or re-encryption of the control words used for pre-encryption) for use with a different transmission key, also known as encryption renewal.

{for generating or managing keys in general (key distribution for secret or secure communication involving central third party, e.g. key distribution center [KDC] or trusted third party [TTP] H04L 9/083; network support of key management H04L 63/06; key management for network security in communication control or processing H04W 12/04)}
Definition statement

This place covers:

Generation and management of keys on the server side.

References
Limiting references

This place does not cover:

Key distribution for secret or secure communication, using a key distribution center, a trusted party or a key server

H04L 9/083

Network support of key management

H04L 63/06

Key management for network security in communication control or processing

H04W 12/04

{for merging a unicast channel into a multicast channel, e.g. in a VOD application, when a client served by unicast channel catches up a multicast channel to save bandwidth (data multicast over packet-switching network H04L 12/18)}
Definition statement

This place covers:

Different channels can be merged into a single one. For example, in a VOD application, a client served by a unicast channel catches up with a multicast channel. Stream merging minimises the bandwidth used.
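
One common variant of such merging is "patching", sketched below with hypothetical values: a late-joining client plays the missed segments over a unicast stream while buffering the ongoing multicast, and releases the unicast channel once playback reaches the buffered data.

```python
def merge_playback(join_segment, total_segments):
    """Toy model: segment numbers double as time; the client joined at 'join_segment'."""
    multicast_buffer = []              # segments recorded from the multicast channel
    playing_from_unicast = True
    for now in range(join_segment, total_segments):
        multicast_buffer.append(now)   # the multicast keeps advancing in real time
        if playing_from_unicast:
            play_pos = now - join_segment          # the unicast patch replays from segment 0
            if play_pos >= multicast_buffer[0]:    # playback caught up with the buffered data
                playing_from_unicast = False       # drop the unicast channel (bandwidth saved)
            else:
                print(f"play segment {play_pos} from the unicast patch stream")
        if not playing_from_unicast:
            print(f"play segment {multicast_buffer.pop(0)} from the multicast buffer")

merge_playback(join_segment=3, total_segments=8)
```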

References
Limiting references

This place does not cover:

Data multicast over packet-switching network

H04L 12/18

Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
Definition statement

This place covers:

The server controls the complexity of the video stream, for example based on the capabilities of the client.

Gathering content from different sources, e.g. Internet and satellite
Definition statement

This place covers:

How content is retrieved from different sources (e.g. satellite and internet) or from different content providers.

Creating a channel for a dedicated end-user group, e.g. insertion of targeted commercials based on end-user profiles {(information retrieval from the Internet by querying with filtering and personalisation G06F 16/9535; arrangements for replacing or switching information during the broadcast H04H 20/10; push services over packet-switching network H04L 12/1859; adaptation of message content in packet-switching networks H04L 51/063)}
Definition statement

This place covers:

  • Generation of a personalized channel for one or a group of clients, according to their preferences. It also receives data from the scheduler.
  • Describes the insertion of targeted commercials by the server (with no further details at bitstream level).
References
Limiting references

This place does not cover:

Information retrieval from the Internet by querying with filtering and personalisation

G06F 16/9535

Arrangements for replacing or switching information during the broadcast

H04H 20/10

Push services over packet-switching network

H04L 12/1859

Adaptation of message content in packet-switching networks

H04L 51/063

Server based end-user applications
Definition statement

This place covers:

Applications where the end-user is aware that they reside on the server, such as video hosting.

Storing end-user {multimedia} data in response to end-user request {, e.g. network recorder}
Definition statement

This place covers:

Storage of private data based on the explicit request of the end-user. The server holds private data received from the client as an extra service.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Management of user data for administrative purposes

H04N 21/25866

Video hosting of uploaded data from client
Definition statement

This place covers:

Personal videos have been uploaded by clients, for example to be viewed by other users.

Remote storage of video programs received via the downstream path, e.g. from the server
Definition statement

This place covers:

The source can be a storage area dedicated to each user, for example to record movies when the capacity of the user's own hard disk is not sufficient.

Content descriptor database or directory service for end-user access {(details of content or meta data based information retrieval of video data in video databases G06F 16/739)}
Definition statement

This place covers:

Creation of directory services, for example by indexing metadata for easy retrieval (keyword search of movies).

References
Limiting references

This place does not cover:

Details of content or meta data based information retrieval of video data in video databases

G06F 16/739

Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
Definition statement

This place covers:

Subject matter directed to the structure or operation of the client or end-user device, such as a TV receiver, a set-top-box, a PC-TV or a mobile appliance (e.g. mobile phone or vehicle), defined as receiving video and possibly additional data from one or several servers or intermediate components via a network for further processing, storing and displaying. This also includes the transmission of these data over a home-based local network to further devices. The client extracts the raw multimedia content from the streams received from possibly heterogeneous sources (e.g. internet, broadcast network) and only provides a network interface. The exchange of control signals with the server or the network, as well as the uploading of client data to the server, is placed by definition in the T-model.

The first layer of this subgroup pertains to the physical description of the client and attached devices, e.g. its internal components, plug-in cards, input means, peripherals.

The second layer is directed to elementary specialized functions such as the processing of the data received from the downstream network interface (e.g. channel decoding, descrambling) and transmitted from the upstream network interface, the demultiplexing into elementary streams and the processing thereof, the extraction of additional data, the local storage of the content within the client device or its forwarding to peripheral devices via a local network, the combined display of several pieces of content on the same screen (e.g. news ticker, advertisement in a separate window), the monitoring of e.g. internal processes, user actions, network bandwidth. This layer also encompasses the software structure of the client device.

The third layer describes high level functions such as the selection of content (e.g. in unidirectional systems where the whole content is sent to the client), the management of content usage (e.g. conditional access, rights), the creation of local virtual channels (e.g. by combining streams retrieved from the broadcast network and the hard disk), the adaptation by learning of internal parameters (e.g. viewer profile).

The last layer is directed to services or applications as provided to the end user of the system such as defining setting parameters, selecting programs, making requests to the server or accessing additional services (e.g. banking, shopping, WWW browsing, gaming).

The subgroup is directed to documents related to the reception and processing of received data.

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Arrangements for distribution where lower stations, e.g. receivers, interact with the broadcast

H04H 20/38

Arrangements specially adapted for receiving broadcast information

H04H 40/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Raw multimedia data per se

H04N 21/80

Structure of client; Structure of client peripherals
Definition statement

This place covers:

Hardware level. Physical description of the multimedia client.

Special rules of classification

As most of the components are always present, e.g. the tuner or the memory, there is no need to describe them all; this avoids unnecessary classification work. An index should be allocated only if one of the components has a critical function in the invention.

It should further be noted that most of the components already have an entry in other technical fields and that, for example, the circuitry of a tuner is not part of this model.

{Peripherals receiving signals from specially adapted client devices}
Definition statement

This place covers:

A peripheral is considered here as an external device which receives multimedia data from the client (which excludes cards) and is in the immediate vicinity of the client (same room or house). The most common example is the video recorder, but other devices on a home network are possible. Also covered are PDAs, game consoles or remote controls with a display.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Input-only peripherals

H04N 21/422

Communication aspects with peripherals

H04N 21/436

{characterised by an identification number or address, e.g. local network address (protecting specific internal or external computer components using identification number G06F 21/73; network arrangements, protocols or services for addressing or naming H04L 61/00)}
Definition statement

This place covers:

Identification number of the peripheral device, network address on the local network.

References
Limiting references

This place does not cover:

Protecting specific internal or external computer components used for computing or processing information by creating or determining hardware identification

G06F 21/73

Network arrangements in data packet switching network, protocols or services for addressing or naming

H04L 61/00

{PC}
Definition statement

This place covers:

The client device, typically an STB here, is connected to a personal computer; it extracts data or software multiplexed with the video signal and forwards it to the PC.

{for generating hard copies of the content, e.g. printer, electronic paper (interfaces to printers G06F 3/12; printing data G06K 15/02)}
Definition statement

This place covers:

The printer can be used for printing coupons or any additional data received by the STB. Also covers electronic paper.

References
Limiting references

This place does not cover:

Interfaces to printers

G06F 3/12

Printing data

G06K 15/02

{additional display device, e.g. video projector (digital output for controlling a plurality of local displays G06F 3/1423)}
Definition statement

This place covers:

Additional display device, e.g. projector, not being the main display device, e.g. TV set, which is always present.

References
Limiting references

This place does not cover:

Digital output for controlling a plurality of local displays

G06F 3/1423

{The peripheral being portable, e.g. PDAs or mobile phones}
Definition statement

This place covers:

Devices receiving data from the client device, typically a remote control with a display, a PDA or a mobile phone. Excludes PCs, printers, additional displays and recorders.

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Constructional details of equipment or arrangements specially adapted for portable computer application

G06F 1/1626

{home appliance, e.g. lighting, air conditioning system, metering devices (home automation data switching networks exchanging configuration information on appliance services H04L 12/2807)}
Definition statement

This place covers:

A home appliance can be a lighting or an air conditioning system or metering devices.

References
Limiting references

This place does not cover:

Home automation data switching networks exchanging configuration information on appliance services

H04L 12/2807

{external recorder (interface circuits between an apparatus for recording television signals and a television receiver H04N 5/775)}
Definition statement

This place covers:

A recording device can be a VCR, an external hard disk or a DVD player acting as a source of video or additional data. Personal video recorders with an internal hard disk are covered elsewhere.

References
Limiting references

This place does not cover:

Interface circuits between an apparatus for recording television signals and a television receiver

H04N 5/775

Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
Definition statement

This place covers:

Different embodiments of video client platforms.

{embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop (constructional details of equipment or arrangements specially adapted for portable computer application G06F 1/1626; arrangements specially adapted for mobile receivers in broadcast systems H04H 20/57)}
Definition statement

This place covers:

The client device is a mobile phone, a PDA or any portable device.

References
Limiting references

This place does not cover:

Constructional details of equipment or arrangements specially adapted for portable computer application

G06F 1/1626

Arrangements specially adapted for mobile receivers in broadcast systems

H04H 20/57

{involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk}
Definition statement

This place covers:

Display device viewable by several users in a public space outside their home, e.g. movie theater or information kiosk. Excludes access points for downloading information.

{located in transportation means, e.g. personal vehicle (arrangements specially adapted for transportation systems in broadcast systems H04H 20/62)}
Definition statement

This place covers:

The client device is located in a vehicle, e.g. car entertainment systems.

References
Limiting references

This place does not cover:

Arrangements specially adapted for transportation systems in broadcast systems

H04H 20/62

{embedded in a} Personal Computer [PC]
Definition statement

This place covers:

The client device is a personal computer but not a portable device.

PVR [Personal Video Recorder] (H04N 5/76 takes precedence {; arrangements for broadcast with accumulation-type receivers H04H 20/40})
Definition statement

This place covers:

The client device is a personal video recorder, STB with hard disk.

References
Limiting references

This place does not cover:

Television signal recording

H04N 5/76

Arrangements for broadcast with accumulation-type receivers

H04H 20/40

External card to be used in combination with the client device, e.g. for conditional access
Definition statement

This place covers:

Cards are external components which can be inserted in a dedicated slot, e.g. smart cards for a conditional access system or extension modules to upgrade the STB capabilities.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Interfacing a plurality of external cards, e.g. through a DVB Common Interface

H04N 21/43607

{for conditional access}
Definition statement

This place covers:

Cards holding a key for Conditional Access purposes, e.g. descrambling, or other decryption operations.

{for identification purposes, e.g. storing user identification data, preferences, personal settings or data (restricting access to computer systems by authenticating users using a predetermined code in combination with an additional device, e.g. dongle or smart card G06F 21/123)}
Definition statement

This place covers:

Card holding identification data of the user, preferences, personal settings or any kind of personal data.

References
Limiting references

This place does not cover:

Restricting access to computer systems by authenticating users using a predetermined code in combination with an additional device, e.g. dongle or smart card

G06F 21/123

{providing its own processing capabilities, e.g. external module for video decoding}
Definition statement

This place covers:

Cards having their own processing capabilities, e.g. an external module for video decoding.

{providing storage capabilities, e.g. memory stick}
Definition statement

This place covers:

Extension modules providing storage capabilities, e.g. memory sticks.

for payment
Definition statement

This place covers:

Bank cards, credit cards or prepaid cards, to be used e.g. in TV shopping applications.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock

H04N 21/43

Information-bearing cards or sheet-like structures characterised by identification or security features for use in combination with accessories specially adapted for information-bearing cards

B42D 25/22

Payment schemes, architectures or protocols

G06Q 20/00

Payment architectures where the payment is settled via telecommunication systems

G06Q 20/16

Payment architectures, schemes or protocols characterised by the use of cards

G06Q 20/34

E-commerce

G06Q 30/00

Mechanisms actuated by coded identity card or credit card to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus

G07F 7/08

Charging arrangements in data networks

H04L 12/14

Input-only peripherals {, i.e. input devices connected to specially adapted client devices}, e.g. global positioning system [GPS] {(input devices also receiving signals from specially adapted client devices H04N 21/4104)}
Definition statement

This place covers:

Devices used to send information and control signals from the user and the user's environment to the client. It includes remote controls, keyboards, mice and microphones, as well as cameras or biosensors.

References
Limiting references

This place does not cover:

Peripherals receiving signals from client devices

H04N 21/4104

Informative references

Attention is drawn to the following places, which may be of interest for search:

Input arrangements or combined input and output arrangements for interaction between user and computer

G06F 3/01

{biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user (input arrangements for interaction with the human body based on nervous system activity detection G06F 3/015)}
Definition statement

This place covers:

Biosensors can be used as passive input from the user. Such devices can be heat sensors for presence detection, EEG sensors or any limb activity sensors worn by the user.

References
Limiting references

This place does not cover:

Input arrangements for interaction with the human body based on nervous system activity detection

G06F 3/015

{environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes}
Definition statement

This place covers:

Sensors connected to the client for determining temperature, luminosity, pressure or earthquakes. Includes position sensors, e.g. GPS.

{sound input device, e.g. microphone}
Definition statement

This place covers:

Any sound input device can be used to generate audio streams or to enter voice commands.

{User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor (constructive details of casings for the remote control device H01H 9/0235; user interfaces for controlling a tuning device of a television receiver through a remote control H03J 9/00; remote control of peripheral devices connected to a television receiver through the remote control device of the television receiver H04B 1/205)}
Definition statement

This place covers:

Only remote control devices transmitting input data to the client device and located in the direct vicinity thereof.

References
Limiting references

This place does not cover:

Constructive details of casings for the remote control device

H01H 9/0235

User interfaces for controlling a tuning device of a television receiver through a remote control

H03J 9/00

Remote control of peripheral devices connected to a television receiver through the remote control device of the television receiver

H04B 1/205

Informative references

Attention is drawn to the following places, which may be of interest for search:

Computer pointing devices in general

G06F 3/033

Interaction techniques for graphical user interfaces in general

G06F 3/048

Remote control devices in general

G08C

{Touch pad or touch panel provided on the remote control}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Touch pads in general

G06F 3/03547

Cameras (H04N 23/00 takes precedence)
Definition statement

This place covers:

Cameras allowing the client to become a video source, e.g. for uploading videos to a server or for identification purposes.

References
Limiting references

This place does not cover:

Television cameras

H04N 23/00

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

Camera

Camera has the meaning of an image generating device and also covers scanners (paper, fingerprint, retina) or any kind of imaging device. The camera allows the client to become a video source. It can be used for identifying the user or for uploading videos to the server.

{Providing} Remote input by a user located remotely from the client device, e.g. at work
Definition statement

This place covers:

The client device is controlled by input devices located at a distant location. A possible application is that a user uses a mobile phone, a PDA or an office PC to program the STB at home. The input device and the client device are connected by a wide area network, e.g. the internet.

Internal components of the client {; Characteristics thereof} (H04N 5/44 takes precedence)
Definition statement

This place covers:

  • Internal components such as tuner, demodulator, demultiplexer, descrambler, video/audio decoder, CPU, volatile memory, hard disk, graphics board/circuitry, modem. It should be noted that certain components are typical for a multimedia client, like the ones used for video processing or the receiver circuitry.
  • Additional built-in cards.
References
Limiting references

This place does not cover:

Receiver circuitry

H04N 5/44

{for processing the incoming bitstream}
Definition statement

This place covers:

Pieces of hardware processing the incoming bitstream.

{involving specific tuning arrangements, e.g. two tuners}
Definition statement

This place covers:

The presence of at least two tuners in a client device.

{for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM}
Definition statement

This place covers:

Internal reader/writer for DVDs, CD-ROMs and similar discs.

{the medium being removable}
Definition statement

This place covers:

Removable hard disk within a client.

{Client identification by a unique number or address, e.g. serial number, MAC address, socket ID (network arrangements, protocols or services for addressing or naming H04L 61/00)}
Definition statement

This place covers:

Hardware identification or serial number, also MAC address, socket ID.

References
Limiting references

This place does not cover:

Network arrangements in data packet switching network, protocols or services for addressing or naming

H04L 61/00

Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware {(real-time communication protocols in data switching networks H04L 65/00)}
Definition statement

This place covers:

Elementary specialized functions. They can be implemented in software or hardware. Their task is to control the corresponding hardware component and to provide a service to the upper layer.

References
Limiting references

This place does not cover:

Real-time communication protocols in data switching networks

H04L 65/00

{Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets (arrangements for synchronising receiver with transmitter by comparing receiver clock with transmitter clock H04L 7/0012; arrangements for synchronising receiver with transmitter wherein the receiver takes measures against momentary loss of synchronisation H04L 7/0083)}
Definition statement

This place covers:

Clock recovery, e.g. extraction of the PCR packets.
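
For illustration, the following minimal Python sketch extracts the PCR from 188-byte MPEG-2 transport stream packets and derives the offset with respect to a local 27 MHz counter; a real decoder would feed this offset into a control loop (e.g. a VCXO), which is not shown.

```python
TS_PACKET_SIZE = 188
SYSTEM_CLOCK_HZ = 27_000_000   # MPEG-2 system clock

def extract_pcr(packet: bytes):
    """Return the PCR of a 188-byte TS packet in 27 MHz ticks, or None if absent."""
    if len(packet) != TS_PACKET_SIZE or packet[0] != 0x47:
        return None
    if not packet[3] & 0x20 or packet[4] == 0:   # no adaptation field, or it is empty
        return None
    if not packet[5] & 0x10:                     # PCR_flag not set
        return None
    b = packet
    base = (b[6] << 25) | (b[7] << 17) | (b[8] << 9) | (b[9] << 1) | (b[10] >> 7)
    ext = ((b[10] & 0x01) << 8) | b[11]
    return base * 300 + ext                      # 27 MHz units

def clock_offset(pcr_ticks: int, local_ticks: int) -> int:
    """Difference between the encoder clock (PCR) and the local 27 MHz counter,
    used to speed up or slow down the decoder clock."""
    return pcr_ticks - local_ticks
```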

References
Limiting references

This place does not cover:

Arrangements for synchronising receiver with transmitter by comparing receiver clock with transmitter clock

H04L 7/0012

Arrangements for synchronising receiver with transmitter wherein the receiver takes measures against momentary loss of synchronisation

H04L 7/0083

{Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen}
Definition statement

This place covers:

Synchronised presentation of the multimedia content according to the time stamps. Additional data can be synchronised with the main content. Also the locking of items at given times.
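
A minimal sketch of such synchronisation, with hypothetical values, compares the presentation time stamp of the audio rendered on a companion device with that of the video shown on the TV screen and returns the correction to apply.

```python
def audio_correction_ms(video_pts_ms: int, audio_pts_ms: int, tolerance_ms: int = 20) -> int:
    """Positive result: delay the audio by that many ms; negative: skip the audio ahead."""
    drift = audio_pts_ms - video_pts_ms
    return 0 if abs(drift) <= tolerance_ms else drift

# Usage sketch: the audio on the phone is 250 ms ahead of the TV picture.
print(audio_correction_ms(video_pts_ms=120_000, audio_pts_ms=120_250))  # -> 250
```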

Generation of visual interfaces {for content selection or interaction}; Content or additional data rendering
Definition statement

This place covers:

Details of the generation of visual interfaces on a video client, involving graphical features and screen space management. They must be differentiated from applications making use of them, such as an EPG.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Receiver circuitry for displaying additional information

H04N 5/445

End-user applications using user interfaces

H04N 21/47

Interaction techniques for graphical user interfaces

G06F 3/048

{involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations}
Definition statement

This place covers:

Layout arrangement on the screen, overlays, menus in general.

{for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid}
Definition statement

This place covers:

Creation of a grid into which all the textual information is fitted, e.g. a rectangular grid.

{for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window}
Definition statement

This place covers:

A separate window or region is provided to display additional data, like a commercial or additional text.

{by altering the content in the rendering process, e.g. blanking, blurring or masking an image region (image enhancement or restoration in general G06T 5/00)}
Definition statement

This place covers:

  • The displayed image can be altered according to certain parameters provided for example by an access control or censoring system. The image can be totally blanked or blurred or a region can be masked.
  • Only details of the filtering action.
  • Damping of brightness.
References
Limiting references

This place does not cover:

Image enhancement or restoration in general

G06T 5/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

High-level filtering performed on a region of the image

H04N 21/45455

Content retrieval operation from a local storage medium, e.g. hard-disk {(details of retrieval of video data and associated meta data in video databases G06F 16/739)}
Definition statement

This place covers:

Retrieval from local storage.

References
Limiting references

This place does not cover:

Details of retrieval of video data and associated meta data in video databases

G06F 16/739

{by playing back content from the storage medium (reproduction of recorded television signals H04N 5/76; reproduction of recorded television signals H04N 9/79)}
Definition statement

This place covers:

Playback of media data from fixed or removable local storage devices.

References
Limiting references

This place does not cover:

Reproduction of recorded television signals

H04N 5/76

Reproduction of recorded television signals

H04N 9/79

Content storage operation, e.g. storage operation in response to a pause request, caching operations
Definition statement

This place covers:

Local storage. The client uses part of its volatile or non-volatile memory, e.g. a hard disk, to store a part of the received multimedia data or data it has generated itself, e.g. monitored data.

{Caching operations, e.g. of an advertisement for later insertion during playback}
Definition statement

This place covers:

Caching operations, for example of commercials for later insertion, or the generation of an internal database, for example for holding EPG data. The storage has a temporary aspect and must be distinguished from recording, which aims more at long-term archiving on request of the user. It is also not meant to describe buffering as performed in the video decoder, which holds the multimedia data only for a brief period of time. Caching is an action which, unlike recording, is transparent to the end-user.

{by placing content in organized collections, e.g. local EPG data repository (interfaces, Database management systems or updating for information retrieval G06F 16/23; details of retrieval of video data and associated meta data in video database G06F 16/739)}
Definition statement

This place covers:

Particular details of the use of a local database. Also the generation of a directory structure within the file system of the client device.

References
Limiting references

This place does not cover:

Interfaces, Database management systems or updating for information retrieval

G06F 16/23

Details of retrieval of video data and associated meta data in video database

G06F 16/739

{Processing operations in response to a pause request}
Definition statement

This place covers:

The incoming video stream, e.g. from live broadcast, can be paused. It is stored locally to allow the user to resume viewing later on.

{Recording operations (recording of a television signal H04N 5/76; arrangements for recording or accumulating broadcast information or broadcast-related information H04H 60/27)}
Definition statement

This place covers:

Recording of received data for archiving purposes, i.e. permanent; not caching, which has a temporary aspect.

References
Limiting references

This place does not cover:

Recording of a television signal

H04N 5/76

Arrangements for recording or accumulating broadcast information or broadcast-related information

H04H 60/27

Housekeeping operations, e.g. prioritizing content for deletion because of storage space restrictions {(storage management, e.g. defragmentation G06F 3/0604; unloading stored programs G06F 9/445; storage management in file systems G06F 16/10; buffering arrangements in a network node or in an end terminal in packet networks H04L 49/90)}
Definition statement

This place covers:

The client device has a limited memory. Algorithms describing which data are prioritized for deletion, e.g. the oldest or least used data.
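
A minimal Python sketch of such a housekeeping policy is given below; the recorded items and the simple oldest/least-used ranking are hypothetical.

```python
from datetime import datetime

# Hypothetical recorded items on the client's hard disk.
recordings = [
    {"title": "Old movie",   "last_viewed": datetime(2023, 1, 5),   "views": 0, "size_gb": 4.0},
    {"title": "Quiz show",   "last_viewed": datetime(2024, 1, 2),   "views": 3, "size_gb": 1.5},
    {"title": "Documentary", "last_viewed": datetime(2023, 11, 20), "views": 1, "size_gb": 2.5},
]

def deletion_candidates(items, space_needed_gb):
    """Select the least used, then oldest, items until enough space would be freed."""
    ranked = sorted(items, key=lambda it: (it["views"], it["last_viewed"]))
    freed, to_delete = 0.0, []
    for item in ranked:
        if freed >= space_needed_gb:
            break
        to_delete.append(item["title"])
        freed += item["size_gb"]
    return to_delete

print(deletion_candidates(recordings, space_needed_gb=5.0))  # ['Old movie', 'Documentary']
```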

References
Limiting references

This place does not cover:

Storage management, e.g. defragmentation

G06F 3/0604

Unloading stored programs

G06F 9/445

Storage management in file systems

G06F 16/10

Buffering arrangements in a network node or in an end terminal in packet networks

H04L 49/90

Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream {(demultiplexing of data packets for data networks, e.g. RTP/UDP H04L 65/00)}
Definition statement

This place covers:

Transport stream demultiplexing. It takes transport streams as input and, after demultiplexing, generates A/V streams, or remultiplexes several transport streams into a new transport stream. Demultiplexing includes PID filtering.
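
For illustration only, the following Python sketch performs PID filtering on 188-byte transport stream packets and collects the packets of the selected PIDs into per-PID streams; section filtering, PES reassembly and resynchronisation are omitted.

```python
TS_PACKET_SIZE = 188

def pid_of(packet: bytes) -> int:
    """13-bit packet identifier of a 188-byte transport stream packet."""
    return ((packet[1] & 0x1F) << 8) | packet[2]

def demultiplex(ts_data: bytes, wanted_pids):
    """Split a multiplex into one byte string of packets per selected PID."""
    elementary = {pid: bytearray() for pid in wanted_pids}
    for offset in range(0, len(ts_data) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = ts_data[offset:offset + TS_PACKET_SIZE]
        if packet[0] != 0x47:          # lost sync; a real demultiplexer would resynchronise
            continue
        pid = pid_of(packet)
        if pid in elementary:          # PID filtering
            elementary[pid].extend(packet)
    return elementary

# Usage sketch with hypothetical PIDs: keep the video PID 0x100 and audio PID 0x101.
# streams = demultiplex(open("mux.ts", "rb").read(), wanted_pids={0x100, 0x101})
```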

References
Limiting references

This place does not cover:

Demultiplexing of data packets for data networks, e.g. RTP/UDP

H04L 65/00

{Demultiplexing isochronously with video sync, e.g. according to bit-parallel or bit-serial interface formats, as SDI}
Definition statement

This place covers:

Demultiplexing performed isochronously with the horizontal video sync, according to bit-parallel or bit-serial interface formats.

{Remultiplexing of multiplex streams, e.g. by modifying time stamps or remapping the packet identifiers}
Definition statement

This place covers:

Modification of bitstream parameters, e.g. re-stamping, trans-multiplexing, or remapping of PIDs.
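
A minimal sketch of PID remapping on 188-byte transport stream packets is given below; a real remultiplexer would additionally rewrite the PSI tables and re-stamp PCR/PTS values, which is omitted here.

```python
TS_PACKET_SIZE = 188

def remap_pids(ts_data: bytes, pid_map: dict) -> bytes:
    """Rewrite the PID of every packet according to pid_map (remapping for remultiplexing)."""
    out = bytearray(ts_data)
    for offset in range(0, len(out) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        old_pid = ((out[offset + 1] & 0x1F) << 8) | out[offset + 2]
        new_pid = pid_map.get(old_pid)
        if new_pid is None:
            continue
        out[offset + 1] = (out[offset + 1] & 0xE0) | ((new_pid >> 8) & 0x1F)
        out[offset + 2] = new_pid & 0xFF
    return bytes(out)

# Usage sketch with hypothetical PIDs: move a video stream from PID 0x100 to 0x200
# so that it does not collide with a second multiplex before remultiplexing.
# new_ts = remap_pids(old_ts, {0x100: 0x200})
```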

{Extraction or processing of SI, e.g. extracting service information from an MPEG stream}
Definition statement

This place covers:

Retrieval of the service information (SI).

{involving stuffing data, e.g. packets or bytes (synchronisation arrangements in time-division multiplex systems with different or fluctuating information rates H04J 3/073)}
Definition statement

This place covers:

Extraction of the software or additional data that have been inserted in the packetised stream by replacement of (or by using the bandwidth occupied by) the stuffing bits/bytes/packets.
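
By way of illustration, and assuming the simple case where the inserted data replace null (stuffing) packets on PID 0x1FFF, the following sketch collects the payload of those packets; other stuffing mechanisms (e.g. stuffing bytes in the adaptation field) are not handled.

```python
TS_PACKET_SIZE = 188
NULL_PID = 0x1FFF   # PID reserved for stuffing (null) packets

def extract_from_stuffing(ts_data: bytes) -> bytes:
    """Collect the payload of null packets, where additional data or software
    may have been inserted in place of the stuffing bytes."""
    hidden = bytearray()
    for offset in range(0, len(ts_data) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = ts_data[offset:offset + TS_PACKET_SIZE]
        if packet[0] != 0x47:
            continue
        pid = ((packet[1] & 0x1F) << 8) | packet[2]
        if pid == NULL_PID:
            hidden.extend(packet[4:])   # 184-byte payload after the 4-byte header
    return bytes(hidden)
```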

References
Limiting references

This place does not cover:

Synchronisation arrangements in time-division multiplex systems with different or fluctuating information rates

H04J 3/073

{Demultiplexing of additional data and video streams}
Definition statement

This place covers:

Extraction of the additional data from a digital video stream.

{by extracting from data carousels, e.g. extraction of software modules from a DVB carousel}
Definition statement

This place covers:

Extraction of the data from a DVB carousel.

Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
Definition statement

This place covers:

Software, additional data and, generally speaking, non-streaming data. This part is dedicated to the retrieval of software modules and non audio-video information, such as additional data (descriptors, WWW pages, etc.), for example by extracting them from a DVB carousel or receiving them from an internet site. Modules are reordered according to a directory module, checked for consistency, and eventually the complete package is rebuilt.

{involving reassembling additional data, e.g. rebuilding an executable program from recovered modules}
Definition statement

This place covers:

After recovery, all modules are ordered and the initial package rebuilt.
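
A minimal Python sketch of such reassembly is given below; the module identifiers are hypothetical and a SHA-256 checksum stands in for whatever consistency check the directory module actually carries.

```python
import hashlib

# Hypothetical modules recovered from a carousel, possibly out of order: (module_id, payload).
recovered = [(2, b"SECOND"), (0, b"FIRST"), (1, b"MIDDLE")]

# Hypothetical directory module: expected order and an integrity checksum.
directory = {
    "order": [0, 1, 2],
    "sha256": hashlib.sha256(b"FIRSTMIDDLESECOND").hexdigest(),
}

def rebuild_package(modules, directory):
    """Reorder the recovered modules, check consistency and rebuild the package."""
    by_id = dict(modules)
    missing = [m for m in directory["order"] if m not in by_id]
    if missing:
        raise ValueError(f"modules still missing: {missing}")
    package = b"".join(by_id[m] for m in directory["order"])
    if hashlib.sha256(package).hexdigest() != directory["sha256"]:
        raise ValueError("consistency check failed")
    return package

print(rebuild_package(recovered, directory))   # b'FIRSTMIDDLESECOND'
```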

{involving decryption of additional data (arrangements using cryptography for the use of broadcast information or broadcast-related information H04H 60/23)}
References
Limiting references

This place does not cover:

Arrangements using cryptography for the use of broadcast information or broadcast-related information

H04H 60/23

{involving reformatting operations of additional data, e.g. HTML pages on a television screen (optimising the visualization of content for information retrieval from the Internet G06F 16/9577; adaptation of message content in packet-switching networks H04L 51/066; media handling at the source in data packet switching networks H04L 65/764)}
Definition statement

This place covers:

Additional information, such as an HTML page, is reformatted by the client device.

References
Limiting references

This place does not cover:

Optimising the visualization of content for information retrieval from the Internet

G06F 16/9577

Adaptation of message content in packet-switching networks

H04L 51/066

Media handling at the source in data packet switching networks

H04L 65/764

{by altering the spatial resolution, e.g. to reformat additional data on a handheld device, attached to the STB}
Definition statement

This place covers:

The resolution of the additional information is modified. It can be used, e.g., to reformat additional data for a handheld device attached to the STB.

{for generating different versions, e.g. for different peripheral devices}
Definition statement

This place covers:

The client device generates at least one other version of the original additional data, which is available together with the original version.

Interfacing a local distribution network, e.g. communicating with another STB {or one or more peripheral devices} inside the home
Definition statement

This place covers:

  • A series of interfaces allowing communication with cards and peripheral devices. It includes, for example, the DVB common interface, secure local communication (e.g. with a smart card or a video recorder) and the connection via IEEE 1394 to other video devices.
  • Communication aspects with these devices when the client becomes a home server.
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Arrangements specially adapted for plural spots in a confined site in broadcast systems

H04H 20/63

{Interfacing a plurality of external cards, e.g. through a DVB Common Interface [DVB-CI]}
Definition statement

This place covers:

Connection via common interface (DVB-CI), multiple conditional access.

{Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals (home Audio Video Interoperability [HAVI] data switching networks H04L 12/2805)}
Definition statement

This place covers:

  • Several peripherals connected to a home network.
  • Documents describing communication aspects on the home network or describing a home network with no special emphasis on the connected peripheral devices.
References
Limiting references

This place does not cover:

Home Audio Video Interoperability (HAVI) data switching networks

H04L 12/2805

{Interfacing an external recording device}
Definition statement

This place covers:

Communication between the client and an external connected recording device.

Adapting the video stream to a specific local network, e.g. a Bluetooth® network
Definition statement

This place covers:

Adaptation of the bitstream to the local network, e.g. transport of video over FireWire.

{involving a wired protocol, e.g. IEEE 1394 (high-speed IEEE 1394 serial bus H04L 12/40052)}
Definition statement

This place covers:

Devices and peripherals connected via a FireWire (IEEE 1394) link.

References
Limiting references

This place does not cover:

High-speed IEEE 1394 serial bus

H04L 12/40052

{HDMI}
Definition statement

This place covers:

Devices and peripherals connected via an HDMI connection.

{involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11] (arrangements for wireless networking or broadcasting of information in indoor or near-field type systems H04B 10/114)}
References
Limiting references

This place does not cover:

Arrangements for wireless networking or broadcasting of information in indoor or near-field type systems

H04B 10/114

Informative references

Attention is drawn to the following places, which may be of interest for search:

Wireless local area data switching networks

H04W

Flow control in wireless networks

H04W 28/10

Establishing a secure communication between the client and a peripheral device or smart card
Definition statement

This place covers:

Secure communication with the peripheral or with a smart card.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Security arrangements for protecting computers or computer systems against unauthorised activity

G06F 21/00

Arrangements for secret or secure communication per se

H04L 9/00

Security arrangements in wireless networks

H04W 12/00

Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server {(flow control in data networks H04L 47/10; streaming protocols, e.g. RTP or RTCP, H04L 65/65; scheduling or organising the servicing of application requests in data packet switching networks H04L 67/60)}
Definition statement

This place covers:

Non-physical details of phone or cable modems. Communication aspects with the server are to be found elsewhere.

References
Limiting references

This place does not cover:

Flow control in data networks

H04L 47/10

Streaming protocols, e.g. RTP or RTCP

H04L 65/65

Scheduling or organising the servicing of application requests in data packet switching networks

H04L 67/60

Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
Definition statement

This place covers:

The downstream network interface processes the electromagnetic waves received from the network and outputs multimedia streams. It comprises channel tuning to obtain the baseband signal, channel decoding and descrambling.

References
Limiting references

This place does not cover:

Transmission of MPEG streams over ATM

H04L 12/5601

Flow control in data networks

H04L 47/10

Real-time communication protocols in data switching networks

H04L 65/00

{Recovering the multiplex stream from a specific network, e.g. recovering MPEG packets from ATM cells (transmission of MPEG streams over ATM H04L 12/5601)}
Definition statement

This place covers:

The bitstream is adapted to a specific network. The type of network or protocol used is classified elsewhere.

References
Limiting references

This place does not cover:

Transmission of MPEG streams over ATM

H04L 12/5601

{Demodulation or channel decoding, e.g. QPSK demodulation (analog front ends or means for connecting modulators, demodulators or transceivers to a transmission line H04L 27/0002)}
Definition statement

This place covers:

Demodulation and error correction.

References
Limiting references

This place does not cover:

Analog front ends or means for connecting modulators, demodulators or transceivers to a transmission line

H04L 27/0002

{Accessing a communication channel}
Definition statement

This place covers:

Channel selection.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Tuning indicators; Automatic tuning control

H04N 5/50

{involving operations to reduce the access time, e.g. fast-tuning for reducing channel switching latency}
Definition statement

This place covers:

Fast channel change or rapid tuning relates to techniques where the STB tries to display an image as quickly as possible in the time interval starting after the user has issued a channel change command and ending when the decoding buffer has been filled. Those techniques comprise, for example, decoding a low-resolution stream or a stream sent at a higher rate.
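
A very simplified sketch of such a fast-tuning decision (hypothetical decoder call-backs and threshold) is given below: a quickly available low-resolution stream is displayed until the main decoding buffer is sufficiently filled.

```python
def on_channel_change(low_res_decoder, full_decoder, buffer_fill, threshold=0.8):
    """Return the decoder output to display at this instant after a channel change."""
    if buffer_fill() < threshold:      # main decoding buffer not yet filled
        return low_res_decoder()       # show something as quickly as possible
    return full_decoder()              # normal operation resumes

# Usage sketch with dummy call-backs:
frame = on_channel_change(lambda: "low-res frame",
                          lambda: "full-quality frame",
                          buffer_fill=lambda: 0.3)
print(frame)   # -> "low-res frame"
```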

Multiplex stream processing, e.g. multiplex stream decrypting
Definition statement

This place covers:

Processing of the transport stream as received from the network and before being sent to the demultiplexer.

{involving multiplex stream decryption (arrangements using cryptography for the use of broadcast information or broadcast-related information H04H 60/23)}
Definition statement

This place covers:

Only the descrambling/decrypting of the transport stream.

References
Limiting references

This place does not cover:

Arrangements using cryptography for the use of broadcast information or broadcast-related information

H04H 60/23

Informative references

Attention is drawn to the following places, which may be of interest for search:

Multiplex stream encryption

H04N 21/23895

Video stream decryption

H04N 21/4405

Processing of audio elementary streams
Definition statement

This place covers:

Audio stream management.

{involving operations for analysing the audio stream, e.g. detecting features or characteristics in audio streams (arrangements characterised by components specially adapted for monitoring, identification or recognition of audio in broadcast systems H04H 60/58)}
Definition statement

This place covers:

The audio stream is parsed to extract or recognize some features or to detect embedded triggers.

References
Limiting references

This place does not cover:

Arrangements characterised by components specially adapted for monitoring, identification or recognition of audio in broadcast systems

H04H 60/58

{by muting the audio signal}
Definition statement

This place covers:

The audio stream is muted because, for example, rights have been violated or for censoring purposes.

{involving reformatting operations of audio signals (details of audio signal transcoding G10L 19/173)}
Definition statement

This place covers:

Reformatting of the audio stream, e.g. by converting from one coding standard to another.

References
Limiting references

This place does not cover:

Details of audio signal transcoding

G10L 19/173

Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
Definition statement

This place covers:

  • Video stream management. Receives the video stream from the demultiplexer and performs MPEG decoding, synchronization with other streams.
  • Management of the video decoder buffer.
{involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream (arrangements characterised by components specially adapted for monitoring, identification or recognition of video in broadcast systems H04H 60/59)}
Definition statement

This place covers:

Detection of features (e.g. logo) in a video stream, extraction of characteristics or generation of metadata in the client directly from the video stream.

References
Limiting references

This place does not cover:

Arrangements characterised by components specially adapted for monitoring, identification or recognition of video in broadcast systems

H04H 60/59

{involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs}
Definition statement

This place covers:

  • Spatial composition of a scene according to the scene graph in the rendering process. Scene graph updating following client/user control is covered here as well as the animation of objects.
  • Specific to the processing of MPEG-4 objects.
{involving splicing one content stream with another content stream, e.g. for substituting a video clip}
Definition statement

This place covers:

Splicing of at least one video stream with another stream (video or not) at the client level. It can be used for inserting or substituting a piece of video such as a commercial.

involving reformatting operations of video signals for household redistribution, storage or real-time display {(details of conversion of video standards at pixel level H04N 7/01; video transcoding H04N 19/40; adapting incoming signals to the display format of the display terminal G09G 5/005; media handling at the source in data packet switching networks H04L 65/764)}
References
Limiting references

This place does not cover:

Details of conversion of video standards at pixel level

H04N 7/01

Video transcoding

H04N 19/40

Adapting incoming signals to the display format of the display terminal

G09G 5/005

Media handling at the source in data packet switching networks

H04L 65/764

{for formatting on an optical medium, e.g. DVD}
Definition statement

This place covers:

The MPEG stream is preprocessed for formatting and recording on a DVD.

{by decomposing into layers, e.g. base layer and one or more enhancement layers}
Definition statement

This place covers:

The client device generates a layered video stream from the original one.

{by media transcoding, e.g. video is transformed into a slideshow of still pictures, audio is converted into text}
Definition statement

This place covers:

Transcoding between modalities, e.g. audio to text.

{the reformatting operation being performed only on part of the stream, e.g. a region of the image or a time segment}
Definition statement

This place covers:

The reformatting operation is performed on part of the stream, the part being a spatial region of the image or a time segment.

{by altering signal-to-noise parameters, e.g. requantization}
Definition statement

This place covers:

New quantization parameters are introduced, allowing the signal-to-noise resolution of each video frame to be changed.

{by altering the spatial resolution, e.g. for displaying on a connected PDA}
Definition statement

This place covers:

Client-side alteration of the spatial resolution, mainly for displaying on a peripheral device.

{for performing aspect ratio conversion}
Definition statement

This place covers:

Conversion of signal for displaying with a different aspect ratio or with a different resolution.

{by altering the temporal resolution, e.g. by frame skipping (television signal recording using magnetic recording on tape for reproducing at a rate different from the recording rate H04N 5/783)}
Definition statement

This place covers:

Alteration of the frame rate.

References
Limiting references

This place does not cover:

Television signal recording using magnetic recording on tape for reproducing at a rate different from the recording rate

H04N 5/783

{for generating different versions}
Definition statement

This place covers:

Client devices generating at least one other version of the original content, which is available together with the original version.

involving video stream decryption
Definition statement

This place covers:

Descrambling of the video stream, decryption of the content stream.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangements using cryptography for the use of broadcast information or broadcast-related information

H04H 60/23

Arrangements for secret or secure communication

H04L 9/00

{by partially decrypting, e.g. decrypting a video stream that has been partially encrypted}
Definition statement

This place covers:

Not all of the signal is scrambled or different parts are encrypted differently, e.g. to reduce processor load or to enable a reduced quality presentation.

involving video stream encryption, e.g. re-encrypting a decrypted video stream for redistribution in a home network
Definition statement

This place covers:

Client devices re-encrypting the decrypted video stream, e.g. with another key. It can be used for secure recording.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangements using cryptography for the use of broadcast information or broadcast-related information

H04H 60/23

Arrangements for secret or secure communication

H04L 9/00

Acquiring end-user identification {, e.g. using personal code sent by the remote control or by inserting a card}
Definition statement

This place covers:

The user identification can be used to retrieve the user's settings or viewing preferences, or in financial transactions. Monitoring of the user's actions is to be classified elsewhere.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Restricting access to computer systems by authenticating users using a predetermined code

G06F 21/33

Authentication in wireless communication networks

H04W 12/06

using biometric characteristics of the user, e.g. by voice recognition or fingerprint scanning {(cryptography using biological data H04L 9/3231; authentication in networks using biometric H04L 63/0861)}
Definition statement

This place covers:

The user is passively identified by facial/fingerprint/voice recognition.

References
Limiting references

This place does not cover:

Cryptography using biological data

H04L 9/3231

Authentication in networks using biometric

H04L 63/0861

Informative references

Attention is drawn to the following places, which may be of interest for search:

Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints

G06F 18/00, G06V 20/00

Restricting access to computer systems by authenticating users using biometric data

G06F 21/32

Authentication in wireless network security

H04W 12/06

Special rules of classification

When classifying in this group, a corresponding symbol should be added to describe the type of sensor / input device used.

Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk {(arrangements for monitoring broadcast services or broadcast-related services H04H 60/29; arrangements for identifying or recognising characteristics with a direct linkage to broadcast information H04H 60/35; monitoring of user activities for profile generation for accessing a video database G06F 16/739; monitoring in wireless networks H04W 24/00)}
Definition statement

This place covers:

Monitoring is an internal process which permanently checks user inputs, the bandwidth available at the different network interfaces or any internal processes. It can generate history data, which is later processed, for example by a recommender system, to automatically build a user profile (implicit profile). Since profiles can also be defined explicitly by the user via a menu, only the passive monitoring of user selections is to be classified here. Creation of explicit profiles is indexed elsewhere.
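
For illustration, the following minimal Python sketch (hypothetical history data) derives an implicit profile, expressed as genre weights, from passively monitored viewing records.

```python
from collections import Counter

# Hypothetical history data produced by passive monitoring of user selections.
history = [
    {"genre": "news",  "minutes": 30},
    {"genre": "sport", "minutes": 90},
    {"genre": "news",  "minutes": 25},
]

def implicit_profile(history):
    """Derive an implicit user profile (genre weights) from the viewing history."""
    minutes = Counter()
    for record in history:
        minutes[record["genre"]] += record["minutes"]
    total = sum(minutes.values())
    return {genre: m / total for genre, m in minutes.items()}

print(implicit_profile(history))   # e.g. {'news': 0.379..., 'sport': 0.620...}
```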

References
Limiting references

This place does not cover:

Monitoring of user activities for profile generation for accessing a video database

G06F 16/739

Arrangements for monitoring broadcast services or broadcast-related services

H04H 60/29

Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information

H04H 60/35

Monitoring in wireless networks

H04W 24/00

{Monitoring of content usage, e.g. the number of times a movie has been viewed, copied or the amount which has been watched (monitoring of user activities for profile generation for accessing a video database G06F 16/739; protecting generic digital content where the protection is independent of the precise nature of the content G06F 21/10; arrangements for monitoring the use made of the broadcast services in broadcast systems H04H 60/31)}
Definition statement

This place covers:

Clients keeping track of how often a piece of content has been viewed, copied or recorded. Also covers records of content ID and of the percentage of a program viewed or recorded.

References
Limiting references

This place does not cover:

Monitoring of user activities for profile generation for accessing a video database

G06F 16/739

Protecting generic digital content where the protection is independent of the precise nature of the content

G06F 21/10

Arrangements for monitoring the use made of the broadcast services in broadcast systems

H04H 60/31

{Monitoring of downstream path of the transmission network originating from a server, e.g. bandwidth variations of a wireless network (arrangements for maintenance or administration in data switching networks involving bandwidth and capacity management H04L 41/0896)}
Definition statement

This place covers:

The connection to the server is monitored, for example its availability or bandwidth.

Example(s) of documents found in this subgroup: US 2008/0104653 A1

References
Limiting references

This place does not cover:

Measuring or estimating channel quality parameters

H04B 17/309

Arrangements for maintenance or administration in data switching networks involving bandwidth and capacity management

H04L 41/0896

{Monitoring of end-user related data (arrangements for monitoring the users' behaviour or opinions in broadcast systems H04H 60/33)}
References
Limiting references

This place does not cover:

Arrangements for monitoring the users' behaviour or opinions in broadcast systems

H04H 60/33

{Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program (methods or arrangements for recognising human body or animal bodies or body parts G06V 40/10; methods or arrangements for acquiring or recognising human faces, facial parts, facial sketches, facial expressions G06V 40/16; methods or arrangements for recognising movements or behaviour G06V 40/20; arrangements for identifying users in broadcast systems H04H 60/45)}
Definition statement

This place covers:

Sensors are used to detect, for example, the presence of individuals in front of the TV set, as well as whether somebody is entering or leaving the room. User reactions, such as movements or facial expressions, should also be classified here.

References
Limiting references

This place does not cover:

Methods or arrangements for recognising human body or animal bodies or body parts

G06V 40/10

Methods or arrangements for acquiring or recognising human faces, facial parts, facial sketches, facial expressions

G06V 40/16

Methods or arrangements for recognising movements or behaviour

G06V 40/20

Arrangements for identifying users in broadcast systems

H04H 60/45

{Analytics of user selections, e.g. selection of programs or purchase activity (monitoring of user selections in data processing systems G06F 11/34; arrangements for monitoring the user's behaviour or opinions in broadcast systems H04H 60/33)}
Definition statement

This place covers:

User selections made, for example, with a remote control are monitored. It covers the selection of programs, duration of viewing, purchase activity, setting of reminders, and answers to quizzes, questionnaires or advertisements. A log file is generated. Also covers clickstreams.

References
Limiting references

This place does not cover:

Monitoring of user selections in data processing systems

G06F 11/34

Arrangements for monitoring the user's behaviour or opinions in broadcast systems

H04H 60/33

Informative references

Attention is drawn to the following places, which may be of interest for search:

Monitoring of user activities for profile generation for accessing a video database

G06F 16/78

Lawful interception

H04L 63/30

Tracking the activity of the user

H04L 67/535

{Monitoring of local network, e.g. connection or bandwidth variations; Detecting new devices in the local network (configuring of peripheral devices in general G06F 9/4411; monitoring connectivity in data switched networks H04L 43/0811)}
Definition statement

This place covers:

The status of the connection or bandwidth variations on the local network are monitored. Also covers the detection of new devices in the local network.

References
Limiting references

This place does not cover:

Configuring of peripheral devices in general

G06F 9/4411

Monitoring connectivity in data switched networks

H04L 43/0811

{Monitoring of peripheral device or external card, e.g. to detect processing problems in a handheld device or the failure of an external recording device (configuring of peripheral devices in general G06F 9/4411; reporting information sensed by appliance or service execution status of appliance services in a home automation network H04L 12/2823; monitoring the status of connected device in data switched networks H04L 43/0817)}
Definition statement

This place covers:

The status of the connected peripheral devices is monitored, e.g. to detect the failure of a VCR or a hard-disk problem of an external storage device.

References
Limiting references

This place does not cover:

Configuring of peripheral devices in general

G06F 9/4411

Reporting information sensed by appliance or service execution status of appliance services in a home automation network

H04L 12/2823

Monitoring the status of connected device in data switched networks

H04L 43/0817

{Monitoring of piracy processes or activities (protecting computer platforms against harmful, malicious or unexpected behaviour or activities using intrusion detection and counter measures G06F 21/566; computer virus detection and handling G06F 21/56)}
References
Limiting references

This place does not cover:

Computer virus detection and handling

G06F 21/56

Protecting computer platforms against harmful, malicious or unexpected behaviour or activities using intrusion detection and counter measures

G06F 21/566

{Monitoring of the internal components or processes of the client device, e.g. CPU or memory load, processing speed, timer, counter or percentage of the hard disk space used (error monitoring in general G06F 11/30; arrangements for monitoring conditions of receiving stations in broadcast systems H04H 60/32; diagnosis, testing or measuring for television receivers H04N 17/04)}
Definition statement

This place covers:

  • The client monitors whether all of its components and internal processes are running properly and reports any problems.
  • CPU and memory load, processing speed, buffer (other than the decoder buffer), timer, counter, percentage of the hard-disk space used, authentication of internal components (see the sketch below).
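
A minimal sketch, with arbitrary threshold values and an illustrative report structure, of how a client could flag internal components whose readings are out of range:

    from dataclasses import dataclass

    @dataclass
    class HealthReport:
        # Snapshot of the client's internal state (all fields are illustrative).
        cpu_load_percent: float
        memory_used_percent: float
        disk_used_percent: float
        decoder_ok: bool

    # Thresholds above which the client would report a problem (arbitrary values).
    LIMITS = {"cpu_load_percent": 90.0, "memory_used_percent": 85.0, "disk_used_percent": 95.0}

    def check(report: HealthReport) -> list[str]:
        # Return the readings that exceed their limit, plus a decoder failure
        # if the decoder self-test failed.
        problems = [name for name, limit in LIMITS.items() if getattr(report, name) > limit]
        if not report.decoder_ok:
            problems.append("decoder")
        return problems

    print(check(HealthReport(cpu_load_percent=97.0, memory_used_percent=60.0,
                             disk_used_percent=40.0, decoder_ok=True)))
    # ['cpu_load_percent']
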
References
Limiting references

This place does not cover:

Diagnosis, testing or measuring for television receivers

H04N 17/04

Error monitoring in general

G06F 11/30

Arrangements for monitoring conditions of receiving stations in broadcast systems

H04H 60/32

Informative references

Attention is drawn to the following places, which may be of interest for search:

Monitoring or testing of receivers with feedback of measurements to the transmitter

H04B 17/24

{Monitoring the upstream path of the transmission network, e.g. its availability, bandwidth}
Definition statement

This place covers:

Monitoring of the upstream connection, e.g. its availability or bandwidth.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Measuring or estimating channel quality parameters

H04B 17/309

Monitoring of client processing errors or hardware failure
Definition statement

This place covers:

Monitoring of errors related e.g. to content uploading, demultiplexing or due to hardware failure.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Monitoring in electrical digital data processing

G06F 11/00

Error detection in general

G06F 11/07

Monitoring in general

G06F 11/30

OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
Definition statement

This place covers:

  • Operating system aspects.
  • Basic functions provided by the operating system like memory management, event handling or details of dedicated software libraries.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Boot device selection; Loading of operating system

G06F 9/4406

Arrangements for program loading or initiating

G06F 9/445

Program loading or initiating in general using non-volatile memory from which the program can be directly executed

G06F 9/44568

{characterized by the use of Application Program Interface [API] libraries}
Definition statement

This place covers:

Details of dedicated software libraries or APIs.

{Powering on the client, e.g. bootstrap loading using setup parameters being stored locally or received from the server (resetting in general G06F 1/14; program loading or initiating in general G06F 9/445; bootstrapping in general G06F 9/4401; secure boots of computer platforms G06F 21/57)}
Definition statement

This place covers:

Describes the action of powering on or booting the client device.

The setup parameters can be stored locally or received from the server.

References
Limiting references

This place does not cover:

Resetting in general

G06F 1/14

Bootstrapping in general

G06F 9/4401

Program loading or initiating in general

G06F 9/445

Secure boots of computer platforms

G06F 21/57

{Memory management (allocation of memory to service a request G06F 9/5016; addressing or allocating within memory systems or architectures G06F 12/02)}
Definition statement

This place covers:

Details of memory access. Pertains only to the RAM and not the hard disk.

References
Limiting references

This place does not cover:

Allocation of memory to service a request

G06F 9/5016

Addressing or allocating within memory systems or architectures

G06F 12/02

{Power management, e.g. shutting down unused components of the receiver (power management in computer systems G06F 1/3203; hibernate or awake process in computer systems G06F 9/4418)}
Definition statement

This place covers:

Battery power management of the receiver, e.g. DVB-H, stand-by mode or shutting down unused parts of the receiver.

References
Limiting references

This place does not cover:

Power management in computer systems

G06F 1/3203

Hibernate or awake process in computer systems

G06F 9/4418

{Implementing a Virtual Machine [VM] (virtual machines in general G06F 9/45533)}
Definition statement

This place covers:

Presence or details of the implementation of a virtual machine.

References
Limiting references

This place does not cover:

Virtual machines in general

G06F 9/45533

{Window management, e.g. event handling following interaction with the user interface}
Definition statement

This place covers:

  • A window manager represents a technical evolution with respect to older techniques of displaying non-video data on a screen, such as PiP or OSD.
  • The creation and management of windows or drawing primitives and, generally speaking, the management of the interaction with a GUI, including event handling.
Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
Definition statement

This place covers:

System management. This layer describes high-level functions of the multimedia client which are nevertheless transparent to the end-user.

{Management of client data or end-user data}
Definition statement

This place covers:

Management functions implemented in the client device.

{involving client characteristics, e.g. Set-Top-Box type, software version or amount of memory available}
Definition statement

This place covers:

The hardware profile describing the processing capabilities of the client is used to discard data streams which the client cannot handle and to retrieve software modules or streams compatible with its capabilities.

It covers hardware and software resources, such as the version of the installed software.
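
A minimal sketch, with an invented client profile and stream descriptors, of how such a capability match could be used to discard incompatible streams:

    # Hypothetical client profile and stream descriptors.
    client_profile = {
        "codecs": {"h264", "mpeg2"},
        "max_height": 720,           # display resolution limit
        "software_version": (2, 1),  # installed middleware version
    }

    available_streams = [
        {"id": "uhd", "codec": "hevc",  "height": 2160, "min_software": (3, 0)},
        {"id": "hd",  "codec": "h264",  "height": 720,  "min_software": (2, 0)},
        {"id": "sd",  "codec": "mpeg2", "height": 576,  "min_software": (1, 0)},
    ]

    def compatible(stream, profile):
        # Discard streams the client cannot decode, display or run.
        return (stream["codec"] in profile["codecs"]
                and stream["height"] <= profile["max_height"]
                and stream["min_software"] <= profile["software_version"])

    print([s["id"] for s in available_streams if compatible(s, client_profile)])
    # ['hd', 'sd'] -- the UHD/HEVC stream is discarded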

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Allocation of resources considering hardware capabilities

G06F 9/5044

Allocation of resources considering software capabilities

G06F 9/5055

Message adaptation based on network or terminal capabilities in packet switching networks

H04L 51/06

Terminal profiles in network data switching protocols

H04L 67/303

Processing of terminal status or physical abilities in wireless networks

H04W 8/22

{involving the geographical location of the client (retrieval from the Internet by querying based on geographical locations G06F 16/9537; systems specially adapted for using geographical information in broadcast systems H04H 60/70; protocols in which the network application is adapted for the location of the user terminal in communication control or processing H04L 67/52; services making use of the location of users or terminals in wireless networks H04W 4/02; locating users or terminals in wireless networks H04W 64/00)}
Definition statement

This place covers:

The geographical position of the client, which can be a regional or ZIP code for a fixed client or data provided by GPS for a mobile client, is used to provide the user with information related to their geographical environment, e.g. regional news or ads.

Example(s) of documents found in this subgroup: US 6,948,183 B1
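
A minimal sketch, assuming an invented mapping from ZIP-code prefixes to regional feeds, of how the client's location could select regional content:

    # Hypothetical mapping of ZIP-code prefixes (fixed clients) or GPS-derived
    # region codes (mobile clients) to regional feeds.
    REGIONAL_FEEDS = {"10": "news-berlin", "80": "news-munich"}
    DEFAULT_FEED = "news-national"

    def regional_feed(zip_code=None, gps_region=None):
        # Fall back to the national feed if no location is available.
        if zip_code:
            return REGIONAL_FEEDS.get(zip_code[:2], DEFAULT_FEED)
        if gps_region:
            return REGIONAL_FEEDS.get(gps_region, DEFAULT_FEED)
        return DEFAULT_FEED

    print(regional_feed(zip_code="80331"))  # news-munich
    print(regional_feed(gps_region="10"))   # news-berlin
    print(regional_feed())                  # news-national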

References
Limiting references

This place does not cover:

Retrieval from the Internet by querying based on geographical locations

G06F 16/9537

Systems specially adapted for using geographical information in broadcast systems

H04H 60/70

Protocols in which the network application is adapted for the location of the user terminal in communication control or processing

H04L 67/52

Services making use of the location of users or terminals in wireless networks

H04W 4/02

Locating users or terminals in wireless networks

H04W 64/00

{involving end-user characteristics, e.g. viewer profile, preferences (monitoring of user activities for profile generation for accessing a video database G06F 16/739; user profiles in network data switching protocols H04L 67/306; processing of user preferences or user profiles in wireless networks H04W 8/18)}
Definition statement

This place covers:

The viewer profile is either compiled from history data, defined explicitly by the user, or received from the server as demographic data.

Example(s) of documents found in this subgroup: WO 2006/1296988 A1

References
Limiting references

This place does not cover:

Monitoring of user activities for profile generation for accessing a video database

G06F 16/735

User profiles in network data switching protocols

H04L 67/306

Processing of user preferences or user profiles in wireless networks

H04W 8/18

Content {or additional data} filtering, e.g. blocking advertisements
Definition statement

This place covers:

The server sends the same content to a plurality of clients, as it does not have any prior knowledge of their requirements. The filtering module extracts the part relevant to the client according to criteria. Advanced filtering systems use learning algorithms to adapt the criteria according to explicit user inputs and/or monitored data. Details of image filtering are described elsewhere.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Filtering and selective blocking of messages over packet-switching networks

H04L 63/0227

{Blocking scenes or portions of the received content, e.g. censoring scenes}
Definition statement

This place covers:

Usually the filter will extract a small portion of the data from the incoming stream; as this operation is trivial, it does not need to be described. However, some applications consider removing a small part of the information from the incoming stream. Examples are the censoring of scenes or image regions, or the blocking of advertisements.

Input to filtering algorithms, e.g. filtering a region of the image
Definition statement

This place covers:

Once the filtering criterion is defined, the filter needs to know to which kind of stream or additional data it has to be applied.

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Filtering for image enhancement or restoration

G06T 5/00

{applied to an object-based stream, e.g. MPEG-4 streams}
Definition statement

This place covers:

The filter could process MPEG-4 streams, for example to delete some objects.

{applied to a region of the image}
Definition statement

This place covers:

A region of interest is defined and displayed, blurred or masked. This can be applied to analog or MPEG-2 streams, where the image has been encoded as a whole and not as a set of objects; the region therefore has to be marked and tracked. Censoring systems can require masking or blurring some regions of the image.
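
A minimal sketch of masking a marked rectangular region; a real system would blur rather than black out the pixels and would track the region from frame to frame:

    # A frame is modelled as a nested list of pixel values (illustrative only).
    def mask_region(frame, top, left, height, width, fill=0):
        for y in range(top, min(top + height, len(frame))):
            for x in range(left, min(left + width, len(frame[y]))):
                frame[y][x] = fill
        return frame

    frame = [[255] * 8 for _ in range(6)]  # tiny 8x6 all-white frame
    mask_region(frame, top=1, left=2, height=3, width=4)
    for row in frame:
        print(row)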

{applied to a time segment}
Definition statement

This place covers:

A time segment of the video is filtered out.

Scheduling content for creating a personalised stream, e.g. by combining a locally stored advertisement with an incoming stream; Updating operations, e.g. for OS modules {; time-related management operations (arrangements for replacing or switching information during the broadcast or during the distribution H04H 20/10)}
Definition statement

This place covers:

The scheduler has a similar function to the scheduler on the server side. It processes incoming streams of data as well as data cached on an internal disk and creates virtual channels. It can also be controlled by the server. It can generate a stream of personalized content.

References
Limiting references

This place does not cover:

Arrangements for replacing or switching information during the broadcast or during the distribution

H04H 20/10

{Automatically resolving scheduling conflicts, e.g. when a recording by reservation has been programmed for two programs in the same time slot}
Definition statement

This place covers:

The client automatically resolves scheduling conflicts, such as having to perform two operations at the same time, e.g. recording two different movies in the same time slot.
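
A minimal sketch, assuming a single-tuner client and an invented priority field, of resolving overlapping recording reservations:

    from dataclasses import dataclass

    @dataclass
    class Recording:
        program: str
        start: int      # minutes since midnight
        end: int
        priority: int   # higher wins, e.g. derived from user preferences

    def resolve(recordings):
        # Keep the higher-priority reservation whenever two reservations overlap;
        # the losing one could instead be rescheduled to a rerun.
        kept = []
        for rec in sorted(recordings, key=lambda r: -r.priority):
            if all(rec.end <= k.start or rec.start >= k.end for k in kept):
                kept.append(rec)
        return sorted(kept, key=lambda r: r.start)

    plan = resolve([Recording("Movie A", 1200, 1320, priority=2),
                    Recording("Movie B", 1260, 1380, priority=1)])
    print([r.program for r in plan])  # ['Movie A'] -- Movie B conflicts and is dropped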

{Content update operation triggered locally, e.g. by comparing the version of software modules in a DVB carousel to the version stored locally (deployment, distribution, installation, update of software G06F 8/65; error detection or correction of the data by redundancy during software upgrading G06F 11/1433; arrangements for updating broadcast information or broadcast-related information H04H 60/25)}
Definition statement

This place covers:

The client itself checks whether an update operation needs to be performed. This can be implemented by comparing the version of software modules in a DVB carousel with the local version.
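
A minimal sketch of such a locally triggered check, with invented module names and version tuples standing in for the carousel and local version tables:

    # Versions announced in the DVB carousel versus versions installed locally.
    carousel_versions = {"epg": (3, 2, 0), "browser": (1, 5, 1)}
    local_versions    = {"epg": (3, 2, 0), "browser": (1, 4, 9)}

    def modules_to_update(announced, installed):
        # Tuple comparison orders versions lexicographically, so (1, 5, 1) > (1, 4, 9).
        return [name for name, version in announced.items()
                if version > installed.get(name, (0,))]

    print(modules_to_update(carousel_versions, local_versions))  # ['browser']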

References
Limiting references

This place does not cover:

Deployment, distribution, installation, update of software

G06F 8/65

Error detection or correction of the data by redundancy during software upgrading

G06F 11/1433

Arrangements for updating broadcast information or broadcast-related information

H04H 60/25

Informative references

Attention is drawn to the following places, which may be of interest for search:

Program updating while running in general

G06F 8/656

Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
Definition statement

This place covers:

Management functions implemented in the client device.

{Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen (arrangements for using the results of monitoring on user's side in broadcast systems H04H 60/65; flow control in packet networks H04L 47/10)}
Definition statement

This place covers:

Scalability control is performed by the client device, for example to forward the data to a low-resolution device on a home network.
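
A minimal sketch, with an invented bitrate ladder and device limits, of selecting a lower-complexity representation for a constrained target device:

    # Representations of the received stream: (height in lines, bitrate in kbit/s).
    LADDER = [(1080, 8000), (720, 4000), (480, 1500), (240, 500)]

    def pick_representation(max_height, max_kbps):
        # Choose the best representation the target device can handle;
        # fall back to the lowest one if nothing fits.
        for height, kbps in LADDER:
            if height <= max_height and kbps <= max_kbps:
                return height, kbps
        return LADDER[-1]

    print(pick_representation(max_height=480, max_kbps=2000))  # (480, 1500)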

References
Limiting references

This place does not cover:

Arrangements for using the results of monitoring on user's side in broadcast systems

H04H 60/65

Flow control in packet networks

H04L 47/10

{Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet (web site content organization and management for information retrieval from the Internet G06F 16/958; transmission by internet of broadcast information H04H 60/82; stock exchange data over packet-switching network H04L 12/1804; push services including data channel over packet-switching network H04L 12/1859)}
Definition statement

This place covers:

  • The client combines data received from different sources, e.g. EPG data from cable operators, satellite services, the Internet or internally stored data.
  • Also describes the connection to the same source via different networks, e.g. a part of the content is distributed via TV broadcast and another part via the Internet.
References
Limiting references

This place does not cover:

Web site content organization and management for information retrieval from the Internet

G06F 16/958

Transmission by internet of broadcast information

H04H 60/82

Stock exchange data over packet-switching network

H04L 12/1804

Push services including data channel over packet-switching network

H04L 12/1859

Processing of entitlement messages, e.g. ECM [Entitlement Control Message] or EMM [Entitlement Management Message] {(arrangements for conditional access to broadcast information or to broadcast-related services H04H 60/14)}
Definition statement

This place covers:

Processing of the ECM, EMM messages received from the server. Details of the descrambling are found elsewhere.

References
Limiting references

This place does not cover:

Arrangements for conditional access to broadcast information or to broadcast-related services

H04H 60/14

Rights management {associated to the content (security in data switching network management H04L 41/28; security management or policies for network security H04L 63/20; access security in wireless networks H04W 12/08)}
Definition statement

This place covers:

Described here is the management of the rights attached to the content, i.e. retrieving the rights associated with the content. The rights of the different users are defined using an application described elsewhere.

References
Limiting references

This place does not cover:

Security in data switching network management

H04L 41/28

Security management or policies for network security

H04L 63/20

Access security in wireless networks

H04W 12/08

Informative references

Attention is drawn to the following places, which may be of interest for search:

Generation of protective data, e.g. certificates

H04N 21/835

Protecting software against unauthorised usage in a vending or licensing environment

G06F 21/10

Learning process for intelligent management, e.g. learning user preferences for recommending movies {(services using the results of monitoring in broadcast systems H04H 60/61)}
Definition statement

This place covers:

The agent is an intelligent system which learns and tries to adapt its output to its inputs. It receives input data directly from the viewer (explicit profile) via a corresponding user interface (e.g. movie ratings) as well as from the monitoring module (implicit profile). Its output can be a control signal to the filter or a recommendation list, which will be displayed in a corresponding user interface. Learning can be implemented using one of the methods described below, but can also be a combination of several methods.
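
A minimal sketch, with arbitrary weights, of combining the explicit profile (ratings entered by the viewer) with the implicit profile (monitored viewing time) into per-genre scores:

    explicit_ratings = {"sci-fi": 5, "sport": 2}     # 1..5 stars from the user interface
    watched_minutes  = {"sci-fi": 300, "news": 120}  # from the monitoring module

    def genre_scores(ratings, minutes, w_explicit=0.7, w_implicit=0.3):
        genres = set(ratings) | set(minutes)
        max_min = max(minutes.values(), default=1)
        return {g: w_explicit * ratings.get(g, 3) / 5          # neutral default rating
                   + w_implicit * minutes.get(g, 0) / max_min
                for g in genres}

    scores = genre_scores(explicit_ratings, watched_minutes)
    print(max(scores, key=scores.get))  # 'sci-fi' would top the recommendation list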

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Monitoring of user activities for profile generation for accessing a video database

G06F 16/739

Computer systems using learning methods

G06N 3/08

Services using the results of monitoring in broadcast systems

H04H 60/61

{Deriving a combined profile for a plurality of end-users of the same client, e.g. for family members within a home (user profiles in network data switching protocols H04L 67/306)}
Definition statement

This place covers:

In contrast to a case-based agent, a collaborative system is based on the similarity between user profiles. A user is compared to other users and, if he is found to belong to a user group, the recommendation list of this group will be used for him. As this system is implemented on the client side, only documents related to user profiles on the same client (stored locally), or to profiles from other clients provided by the server to the client concerned, should be classified here.

Example(s) of documents found in this subgroup: WO 03/043337 A1
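
A minimal sketch, with invented per-member genre preferences stored locally, of deriving a combined household profile by simple averaging:

    members = {
        "alice": {"sci-fi": 0.9, "sport": 0.1, "news": 0.4},
        "bob":   {"sci-fi": 0.3, "sport": 0.8, "news": 0.5},
        "kid":   {"sci-fi": 0.6, "sport": 0.2, "news": 0.0},
    }

    def combined_profile(profiles):
        # Average each genre score over the household members; a recommender
        # could use the result when the whole family is watching together.
        genres = {g for p in profiles.values() for g in p}
        return {g: sum(p.get(g, 0.0) for p in profiles.values()) / len(profiles)
                for g in genres}

    print(combined_profile(members))
    # sci-fi 0.6, sport ~0.37, news 0.3 (dictionary order may vary)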

References
Limiting references

This place does not cover:

User profiles in network data switching protocols

H04L 67/306

Informative references

Attention is drawn to the following places, which may be of interest for search:

Deriving collaborative data from a large group of end-users on the server

H04N 21/252

{characterized by learning algorithms}
Definition statement

This place covers:

Types of learning method used by the agent.

{involving probabilistic networks, e.g. Bayesian networks}
Definition statement

This place covers:

Bayesian (probabilistic) networks are used.

{involving classification methods, e.g. Decision trees}
Definition statement

This place covers:

Decision trees or any type of classifier are used.

{using neural networks, e.g. processing the feedback provided by the user}
Definition statement

This place covers:

The agent will try to match its output to the feedback provided by the user using a neural network. The learning process requires several iterations to converge.

{Processing of monitored end-user data, e.g. trend analysis based on the log file of viewer selections}
Definition statement

This place covers:

  • User selections are recorded in a history file; this group describes processing operations performed on the history, e.g. trend analysis or clustering.
  • The generation of the user profile, if disclosed explicitly. The dynamic adaptation of the profile is performed by an intelligent agent.
{for recommending content, e.g. movies}
Definition statement

This place covers:

Generation of a recommendation or suggestion list.

End-user applications
Definition statement

This place covers:

End-user applications in the sense of services provided by the multimedia system to the users. There are basically two categories of applications: those providing local interactivity and those requiring an uplink.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Receiver circuitry for displaying additional information

H04N 5/445

Interaction techniques for graphical user interfaces

G06F 3/048

Software engineering for user interfaces

G06F 8/20

Services or applications for real-time multimedia communications

H04L 65/40

End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content {(end-user interfaces for retrieving video data from a database G06F 16/739; network services for supporting unicast streaming H04L 65/612)}
Definition statement

This place covers:

A request application allows the user to request a program or any additional information. Covers all on-request applications. The request may be fulfilled immediately, with a small delay or later in the future. The headgroup also covers requests for downloading music.

Example(s) of documents found in this subgroup: EP 1 947 855 A1

References
Limiting references

This place does not cover:

End-user interfaces for retrieving video data from a database

G06F 16/7335

Network services for supporting unicast streaming

H04L 65/612

{for requesting content on demand, e.g. video on demand}
Definition statement

This place covers:

True VOD systems allowing the user to request and receive a movie within a short delay. The movie is therefore streamed only to the requesting user or made available on a multicast channel. This group also covers details of the menu to stop, pause, FFWD, RWD or play a movie.

{for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally}
Definition statement

This place covers:

  • Users interacting with MPEG-4 objects.
  • Editing by the end-user on the client device.
{for requesting near-video-on-demand content}
Definition statement

This place covers:

Movies are sent on a regular basis with a time offset (staggered) on different broadcast channels.
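
A worked example of the staggered start times: if the same movie starts every 15 minutes on parallel channels, the waiting time follows directly from the current offset.

    def minutes_until_next_start(now_min, stagger=15):
        # Minutes (measured from the first showing) until the next copy begins.
        return (stagger - now_min % stagger) % stagger

    print(minutes_until_next_start(now_min=47, stagger=15))  # 13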

{for requesting pay-per-view content (payment schemes payment architectures or payment protocols G06Q 20/00, G07F)}
Definition statement

This place covers:

Broadcast programs, other than NVOD, associated with a request for purchasing, e.g. free preview programs.

References
Limiting references

This place does not cover:

Payment schemes, payment architectures or payment protocols

G06Q 20/00, G07F

{for content reservation or setting reminders; for requesting event notification, e.g. of sport results or stock market (stock exchange data over packet-switching network H04L 12/1804; push services over packet-switching network H04L 12/1859; notification of incoming messages in packet switching networks H04L 51/224)}
Definition statement

This place covers:

  • The viewer can mark a program displayed in an EPG for later viewing or recording. Pertains to the reservation of a time, a channel or a piece of content.
  • Bookmarking operations as well as requests for notification when an event has occurred, e.g. sport results or a stock quote rising above a given level.
References
Limiting references

This place does not cover:

Stock exchange data over packet-switching network

H04L 12/1804

Push services over packet-switching network

H04L 12/1859

Notification of incoming messages in packet switching networks

H04L 51/224

{for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks (specific graphical features in visual interfaces H04N 21/4312)}
References
Limiting references

This place does not cover:

Specific graphical features in visual interfaces

H04N 21/4312

for requesting additional data associated with the content
Definition statement

This place covers:

The user actively requests additional data, e.g. by pressing a button on a remote control when an icon signaling the presence of interactive content is displayed on the screen.

using interactive regions of the image, e.g. hot spots {(details of information retrieval from the Internet by using URLs G06F 16/955; processing chained hypermedia data for information retrieval G06F 16/94)}
Definition statement

This place covers:

Additional data is accessed by clicking on a hotspot.

References
Limiting references

This place does not cover:

Processing chained hypermedia data for information retrieval

G06F 16/94

Details of information retrieval from the Internet by using URLs

G06F 16/955

for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
Definition statement

This place covers:

Manual selection of a portion of the displayed frame on the screen by the user.

End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
Definition statement

This place covers:

Profile applications allow the user to define parameters which will control the viewing experience.

{for defining user accounts, e.g. accounts for children}
Definition statement

This place covers:

Master users, e.g. parents, defining several accounts for the users of the client, e.g. for children.

{for user identification, e.g. by entering a PIN or password (cryptographic authentication protocols H04L 9/32; networks authentication protocols H04L 63/08)}
Definition statement

This place covers:

The user identifies himself to the client by entering a password or a PIN. Passive identification is classified elsewhere.

References
Limiting references

This place does not cover:

Cryptographic authentication protocols

H04L 9/32

Network authentication protocols

H04L 63/08

{for defining user preferences, e.g. favourite actors or genre (retrieval personalisation and generation of user profiles for the retrieval of video data G06F 16/739; user profiles in network data switching protocols H04L 67/306)}
Definition statement

This place covers:

The user enters for example his favorite channels, actors, directors, program genre or just a rating level (as used with a V-chip). Covers menus for parental control in general.

Example(s) of documents found in this subgroup: US 2002/0140728 A1

References
Limiting references

This place does not cover:

Retrieval personalisation and generation of user profiles for the retrieval of video data

G06F 16/735

User profiles in network data switching protocols

H04L 67/306

{for rating content, e.g. scoring a recommended movie}
Definition statement

This place covers:

This application is required for example by the agent module during its learning phase. Items are displayed on the screen and the user is requested to provide a rating.

{for providing answers, e.g. voting}
Definition statement

This place covers:

Questions and answers. It can be used to poll users about their opinion regarding a problem raised during the TV broadcast, to react to an advertisement or to take part in a TV quiz. This group also covers voting.

Supplemental services, e.g. displaying phone caller identification, shopping application
Definition statement

This place covers:

Described here are applications which are provided as additional services to the users but do not belong to the core services of a multimedia system.

{Electronic banking (banking in general G06Q 30/02)}
Definition statement

This place covers:

On-line banking applications, including the trading of stocks.

References
Limiting references

This place does not cover:

Banking in general

G06Q 30/02

{Games}
Definition statement

This place covers:

Only games which do not interact with the video stream, e.g. MPEG-4 based games, and which are not of a question-and-answer type, e.g. a quiz. They can be local or played with remote opponents.

{Electronic shopping (payment schemes, payment architectures or payment protocols for electronic shopping systems G06Q 20/12)}
Definition statement

This place covers:

TV home-shopping applications, including requesting quotes for services; excludes requests for additional data.

References
Limiting references

This place does not cover:

Payment schemes, payment architectures or payment protocols for electronic shopping systems

G06Q 20/12

Web browsing {, e.g. WebTV}
Definition statement

This place covers:

The TV terminal is used as a WWW browser (e.g. WebTV) to display WWW pages. It must not be confused with systems where video or program-related data is retrieved from the Internet without active browsing by the user. Nor should this place be used for PCs with an Internet connection.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Retrieval from the web

G06F 16/95

Web-based protocols

H04L 67/02

receiving rewards
Definition statement

This place covers:

Users receive awards, coupons, prizes, points or air miles.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Payment schemes, architectures or protocols

G06Q 20/00

E-commerce

G06Q 30/00

Charging arrangements in data networks

H04L 12/14

e-mailing {(message switching systems, e.g. electronic mail systems H04L 51/00)}
Definition statement

This place covers:

E-mail application as known from computer systems but implemented on a Set-Top-Box or TV receiver.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Message switching systems, e.g. electronic mail systems

H04L 51/00

Wireless messaging

H04W 4/12

communicating with other users, e.g. chatting {(arrangements for providing for computer conferences, e.g. chat rooms, to substation in data switching networks H04L 12/1813; distributed application using peer-to-peer [P2P] networks H04L 67/104)}
Definition statement

This place covers:

Systems allowing users of distinct clients to communicate with each other, for example to exchange videos or any kind of data, but not e-mails. Covers chat applications, bulletin boards and forums.

References
Limiting references

This place does not cover:

Arrangements for providing for computer conferences, e.g. chat rooms, to substation in data switching networks

H04L 12/1813

Distributed application using peer-to-peer [P2P] networks

H04L 67/104

End-user interface for program selection {(broadcast systems using EPGs H04H 60/72)}
Definition statement

This place covers:

Selection menus allowing the user to actively select a piece of content from a plurality, e.g. a function provided by an Electronic Program Guide.

References
Limiting references

This place does not cover:

Broadcast systems using EPGs

H04H 60/72

{using a grid, e.g. sorted out by channel and broadcast time}
Definition statement

This place covers:

Programs are displayed in a grid, sorted out by channel and broadcast time.

{using a channel name}
Definition statement

This place covers:

Channels are selected by entering their name instead of number.

{using recommendation lists, e.g. of programs or channels sorted out according to their score}
Definition statement

This place covers:

A recommendation list of desirable items has been compiled by the agent module and is displayed to the user. It is mostly an ordered list, where items are sorted according to their score, which may also be displayed next to the item descriptor. Items can be programs or channels.

{for searching program descriptors (retrieval of video data G06F 16/739)}
Definition statement

This place covers:

The application provides a search function, for example using keywords to retrieve an actor's name.

References
Limiting references

This place does not cover:

Retrieval of video data

G06F 16/739

End-user interface for client configuration
Definition statement

This place covers:

Configuration applications allow the user to define the settings of the client.

{for modifying image parameters, e.g. image brightness, contrast}
Definition statement

This place covers:

Image brightness, contrast, setting of the color channels.

{for language selection, e.g. for the menu or subtitles}
Definition statement

This place covers:

Language selection for e.g. configuration or setup menus or subtitles.

{for modifying screen layout parameters, e.g. fonts, size of the windows}
Definition statement

This place covers:

Layout parameters such as colors, fonts, size of the windows.

Data services, e.g. news ticker {(systems specially adapted for using meteorological information in broadcast systems H04H 60/71)}
Definition statement

This place covers:

Presentation of information and data services. Only applications pertaining to the display of such data should be classified here.

References
Limiting references

This place does not cover:

Systems specially adapted for using meteorological information in broadcast systems

H04H 60/71

{for displaying messages, e.g. warnings, reminders (arrangements for providing short real-time information to substation in data switching networks H04L 12/1895)}
Definition statement

This place covers:

  • Display of warnings or reminders. Usually pertains to textual or graphical information which is displayed for a brief period of time.
  • Download status bar.
References
Limiting references

This place does not cover:

Arrangements for providing short real-time information to substation in data switching networks

H04L 12/1895

{for displaying subtitles}
Definition statement

This place covers:

Subtitles or closed captions.

{for displaying a ticker, e.g. scrolling banner for news, stock exchange, weather data}
Definition statement

This place covers:

News, stock exchange or weather data are displayed as a scrolling banner on the screen.

{for displaying teletext characters}
Definition statement

This place covers:

Teletext service.

Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client {, e.g. sending from server to client commands for recording incoming content stream}; Communication details between server and client 
Definition statement

This place covers:

  • Subject matter comprising processes and structure involving the exchange of data and control signals between servers, clients and intermediate components connected within a network or networks. These processes and structure generally involve communication between intermediate components, the network interface of servers and that of the clients, resulting in data or control signals being exchanged therebetween in a particular manner and/or for a particular purpose, and processing operations performed by the network itself.
  • The first layer is directed to the physical description of the network such as the nature of the network used for the downlink and uplink connection (e.g. satellite, cable, internet, GSM) and the components used for the transmission of the electromagnetic waves (e.g. taps, splitters, amplifiers).
  • The second layer describes communication aspects such as addressing (e.g. multicasting), the type of protocol used (e.g. DSM-CC, ATM). It also includes the exchange of low-level control signals originating from server (e.g. encryption key), client or network as well as processing operations by the network itself (e.g. protocol conversion, dropping of packets).
  • The third layer pertains to the exchange of high-level control signals originating from client and transmitted to the server (e.g. viewing history, VOD control parameters), or issued by the server and sent to the client (e.g. trigger recording, channel tuning or sending setup parameters).
  • The subgroup is directed to the invention information including data transactions necessitating communication between server, client and network. Documents where the invention information relates to the transmitter or receiver per se are placed in the server (S) or client (C) models, respectively.

Examples of documents placed in H04N 21/60:

(1) The subgroup is directed to the invention information including physical level components of the distribution model. The physical level components of the T-model preclude physical components of the server and client models. For example, documents where the invention information includes physical level components (such as taps, splitters, amplifiers, etc.) are placed in H04N 21/6106, whereas server physical components (such as modulators, multiplexers, etc.) are placed in H04N 21/21; similarly, client physical components (such as tuners, demultiplexers, etc.) are placed in H04N 21/41.

(2) This subgroup is directed to describing the type of protocol used (e.g. ATM) for transport on a specific type of network. The actual adaptation process is described elsewhere.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Cryptographic protocols

H04L 9/00

Data switching networks

H04L 12/00

Network security protocols

H04L 63/00

Network arrangements or protocols for real-time communications

H04L 65/00

Media handling, encoding, streaming or conversion

H04L 65/60

Network protocols for data switching network services

H04L 67/01

Protocols for client-server architecture

H04L 67/01

Wireless communication networks

H04W

Network physical structure; Signal processing (H04B takes precedence)
Definition statement

This place covers:

Physical level and network topology.

References
Limiting references

This place does not cover:

Transmission

H04B

{specially adapted to the downstream path of the transmission network}
Definition statement

This place covers:

Type of the downlink connection. The first three subgroups are trivial and should only be used if they play a significant part in the description of a document.

{involving terrestrial transmission, e.g. DVB-T}
Definition statement

This place covers:

Terrestrial transmission. Preferably, DVB-T documents should be classified here.

{involving transmission via Internet (transmission by internet of broadcast information H04H 60/82)}
Definition statement

This place covers:

  • Typically video streaming via Internet. It must not be confused with browsers or WWW servers providing additional data.
  • Computer networks in general (e.g. ATM).
References
Limiting references

This place does not cover:

Transmission by internet of broadcast information

H04H 60/82

Informative references

Attention is drawn to the following places, which may be of interest for search:

IP communication protocol

H04N 21/64322

{involving transmission via a mobile phone network (wireless downlink channel access H04W 74/006)}
Definition statement

This place covers:

Multimedia data are transported over a mobile phone network. Wireless transmission in home networks is excluded, but wireless wide area networks and wireless connections to a public access point are covered.

References
Limiting references

This place does not cover:

Wireless downlink channel access

H04W 74/006

{involving transmission via a telephone network, e.g. POTS}
Definition statement

This place covers:

Video over a phone line, e.g. using the H.223 multiplexing and H.245 control standards. Excludes xDSL-type connections, which are described elsewhere.

{Signal processing at physical level (signal processing in analog two-way television systems H04N 7/173)}
Definition statement

This place covers:

Details of signal processing at the lowest ISO layer.

References
Limiting references

This place does not cover:

Signal processing in analog two-way television systems

H04N 7/173

{specially adapted to the upstream path of the transmission network}
Definition statement

This place covers:

Type of the uplink connection.

{involving terrestrial transmission, e.g. DVB-T}
Definition statement

This place covers:

Terrestrial transmission. Preferably, DVB-T documents should be classified here.

{involving cable transmission, e.g. using a cable modem}
Definition statement

This place covers:

Typically for uplinks using a cable modem.

{involving transmission via Internet (broadcast-related systems characterised by the transmission system being the Internet H04H 60/82)}
References
Limiting references

This place does not cover:

Broadcast-related systems characterised by the transmission system being the Internet

H04H 60/82

{involving transmission via a mobile phone network (arrangements for providing broadcast or conference services to substation in data switching networks in combination with wireless systems H04L 12/189; wireless uplink channel access H04W 74/004)}
Definition statement

This place covers:

Low-cost clients do not provide a return channel; the uplink can still be established if the user owns a mobile phone. Wireless transmission in home networks is excluded, but wireless wide area networks and wireless connections to a public access point are covered.

References
Limiting references

This place does not cover:

Arrangements for providing broadcast or conference services to substation in data switching networks in combination with wireless systems

H04L 12/189

Wireless uplink channel access

H04W 74/004

{involving transmission via a telephone network, e.g. POTS}
Definition statement

This place covers:

Uplink using an analog phone modem, for example in 2-way TV systems.

{involving transmission via a satellite (arrangements for data linking, networking or transporting, or for controlling an end to end session in a satellite broadcast system H04B 7/18526)}
References
Limiting references

This place does not cover:

Arrangements for data linking, networking or transporting, or for controlling an end to end session in a satellite broadcast system

H04B 7/18526

Control signaling {related to video distribution} between client, server and network components; Network processes for video distribution between server and clients {or between remote clients}, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing (real-time session protocols H04L 65/1101; distributed application using peer-to-peer [P2P] networks H04L 67/104)
Definition statement

This place covers:

Low-level control signals exchanged between client and server.

References
Limiting references

This place does not cover:

Real-time session protocols

H04L 65/1101

Distributed application using peer-to-peer [P2P] networks

H04L 67/104

{Multimode Transmission, e.g. transmitting basic layers and enhancement layers of the content over different transmission paths or transmitting with different error corrections, different keys or with different transmission protocols}
Definition statement

This place covers:

Transmission of structured video content (i.e. video which is provided as layers, different versions, objects, etc.) over different transmission paths or with different error corrections, different scrambling keys, different transmission protocols, etc. It describes the presence of different transmission "conditions" for the layers, versions or objects. Application example: strong error protection and a network with good QoS are used for important video layers, while less important layers are sent with no error correction over a more error-prone network.

{using a connection between clients on a wide area network, e.g. setting up a peer-to-peer communication via Internet for retrieving video segments from the hard-disk of other client devices (broadcast-related systems characterised by transmission among terminal devices H04H 60/80; distributed application using peer-to-peer [P2P] networks H04L 67/104)}
Definition statement

This place covers:

Content can be accessed on the storage media of client devices, e.g. parts of a movie can be retrieved from the hard disks of other users instead of using the cache of a server. Chatting applications are classified elsewhere.

References
Limiting references

This place does not cover:

Broadcast-related systems characterised by transmission among terminal devices

H04H 60/80

Distributed application in data packet switching network using peer-to-peer [P2P] networks

H04L 67/104

Control signals issued by server directed to the network components or client {(management of faults, events, alarms in data networks H04L 41/06)}
Definition statement

This place covers:

Server-side control signals. Described here are low-level control signals issued by the server for controlling the network or the client.

References
Limiting references

This place does not cover:

Management of faults, events, alarms in data networks

H04L 41/06

for authorisation, e.g. by transmitting a key {(wireless communications network key management H04W 12/04; wireless communications network access security H04W 12/08)}
References
Limiting references

This place does not cover:

Wireless communications network key management

H04W 12/04

Wireless communications network access security

H04W 12/08

Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangements for secret or secure communication

H04L 9/00

{by transmitting keys (key distribution for secret or secure communication H04L 9/08; network support of key management H04L 63/06)}
Definition statement

This place covers:

Download of keys, for example transmission of ECM/EMM in a conditional access system.

References
Limiting references

This place does not cover:

Key distribution for secret or secure communication

H04L 9/08

Network support of key management

H04L 63/06

directed to decoder
Definition statement

This place covers:

Control of the video decoder by the server.

Control signals issued by the client directed to the server or network components
Definition statement

This place covers:

Control signals sent by the client device, e.g. for controlling the network or the server.

directed to network
Definition statement

This place covers:

Control signals sent by the client device for controlling the network.

for rate control {, e.g. request to the server to modify its transmission rate (flow control in packet networks H04L 47/10)}
Definition statement

This place covers:

The client sends a control signal to the server (e.g. encoder) or to the network requesting a bitrate modification.
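
A minimal sketch, with an invented message format and arbitrary buffer thresholds, of a client deriving such a rate-control request from its playout buffer level:

    import json

    def rate_request(buffer_level_s, current_kbps):
        # Ask for a lower rate near underflow, a higher rate when there is margin,
        # or send nothing when the buffer level is acceptable.
        if buffer_level_s < 2.0:
            target = int(current_kbps * 0.7)
        elif buffer_level_s > 10.0:
            target = int(current_kbps * 1.2)
        else:
            return None
        return json.dumps({"type": "RATE_REQUEST", "target_kbps": target})

    print(rate_request(buffer_level_s=1.2, current_kbps=4000))
    # {"type": "RATE_REQUEST", "target_kbps": 2800}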

References
Limiting references

This place does not cover:

Flow control in packet networks

H04L 47/10

for requesting retransmission {, e.g. of data packets lost or corrupted during transmission from server} (ARQ protocols H04L 1/18; implementation or adaptation of Internet protocol [IP], of transmission control protocol [TCP] or of user datagram protocol [UDP] H04L 69/16)
Definition statement

This place covers:

The client asks the server or the network to retransmit some data packets that have been lost or corrupted.

References
Limiting references

This place does not cover:

ARQ protocols

H04L 1/18

Implementation or adaptation of Internet protocol [IP], of transmission control protocol [TCP] or of user datagram protocol [UDP]

H04L 69/16

directed to server {(one-way streaming services wherein the source is controlled by the destination H04L 65/613)}
Definition statement

This place covers:

Control signals issued by the client to the server.

References
Limiting references

This place does not cover:

One-way streaming services wherein the source is controlled by the destination

H04L 65/613

{for uploading keys, e.g. for a client to communicate its public key to the server (key management H04L 9/08; network support of key management H04L 63/06)}
Definition statement

This place covers:

Describes clients communicating their public key to the server.

References
Limiting references

This place does not cover:

Key management

H04L 9/08

Network support of key management

H04L 63/06

directed to encoder {, e.g. for requesting a lower encoding rate}
Definition statement

This place covers:

Control of the video encoder; also covers requests for transcoding.

Addressing {(network arrangements, protocols or services for addressing or naming H04L 61/00; support for multicast or broadcast of one-way stream services H04L 65/611)}
References
Limiting references

This place does not cover:

Network arrangements in data packet switching network, protocols or services for addressing or naming

H04L 61/00

Support for multicast or broadcast of one-way stream services in data packet switching network

H04L 65/611

Address allocation for clients {(address allocation in data networks H04L 61/50)}
Definition statement

This place covers:

Describes the process of allocating addresses to the clients.

References
Limiting references

This place does not cover:

Address allocation in data networks

H04L 61/50

Multicasting {(data broadcast and multicast in packet switching networks H04L 12/18)}
Definition statement

This place covers:

Data is sent to a group of clients.

References
Limiting references

This place does not cover:

Data broadcast and multicast in packet switching networks

H04L 12/18

Unicasting
Definition statement

This place covers:

Data is sent to only one client on a dedicated channel.

Communication protocols {(network streaming protocols, e.g. real-time transport protocol [RTP] or real-time control protocol [RTCP], H04L 65/65)}
Definition statement

This place covers:

Details of protocols are classified elsewhere.

References
Limiting references

This place does not cover:

Network streaming protocols in data packet switching network, e.g. real-time transport protocol [RTP] or real-time control protocol [RTCP]

H04L 65/65

{ATM}
Definition statement

This place covers:

Adaptation of MPEG packets for transport on an ATM network.

{IP}
Definition statement

This place covers:

Similarly, the adaptation of MPEG packets for transport on an IP network.

Digital Storage Media - Command and Control Protocol [DSM-CC]
Definition statement

This place covers:

DSM-CC has been designed for MPEG systems.

Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless {(real-time session protocols H04L 65/1101)}
Definition statement

This place covers:

  • Network control and processing.
  • Low-level control signals issued by the network for controlling the server or the client as well as operations performed by the network on the content.
References
Limiting references

This place does not cover:

Real-time session protocols in data packet switching network

H04L 65/1101

{for transferring content from a first network to a second network, e.g. between IP and wireless}
Definition statement

This place covers:

The input signal is routed during the transport to a different network, for example the video stream is sent by the server on an IP network and received by the client via a wireless network.

{Protecting content from unauthorized alteration within the network (verifying the information received for network security in communication control or processing H04L 63/12; integrity in wireless network security H04W 12/10)}
Definition statement

This place covers:

Additional measures to protect the data from forbidden alterations during the transport.

References
Limiting references

This place does not cover:

Verifying the information received for network security in communication control or processing

H04L 63/12

Integrity in wireless network security

H04W 12/10

{Monitoring of network processes or resources, e.g. monitoring of network load (traffic related reporting in data switching networks H04L 43/062)}
References
Limiting references

This place does not cover:

Traffic related reporting in data switching networks

H04L 43/062

{Monitoring network processes errors (for recovering from a failure of a protocol instance or entity H04L 69/40)}
Definition statement

This place covers:

Monitoring of errors during network processing.

References
Limiting references

This place does not cover:

Recovering in data packet switching network from a failure of a protocol instance or entity

H04L 69/40

{Monitoring network characteristics, e.g. bandwidth, congestion level (data switched network analysis H04L 41/14; monitoring functioning in data switched networks H04L 43/0817; flow control in packet networks H04L 47/10)}
Definition statement

This place covers:

Monitoring by the network of the congestion level, bandwidth, BER or status of the connection (e.g. dropped).

References
Limiting references

This place does not cover:

Data switched network analysis

H04L 41/14

Monitoring functioning in data switched networks

H04L 43/0817

Flow control in packet networks

H04L 47/10

{Control signals issued by the network directed to the server or the client}
Definition statement

This place covers:

Network-side control signals.

{for rate control (flow control in packet networks H04L 47/10)}
Definition statement

This place covers:

The network sends a control signal to the server (e.g. encoder or pump) requesting that the bitrate be adapted to the available bandwidth.
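
Purely as an illustration, a minimal Python sketch of how such a network-issued rate-control request might look; the message format, field names and the build_rate_control_request function are invented for this example and are not defined by the classification scheme or by any particular protocol:

  import json

  # Hypothetical sketch: the network asks the server/encoder to adapt its bitrate
  # to the measured bandwidth. All field names are invented for illustration.
  def build_rate_control_request(session_id: str, measured_bandwidth_kbps: int,
                                 headroom: float = 0.8) -> str:
      """Return a JSON control message requesting a new target bitrate."""
      target_bitrate = int(measured_bandwidth_kbps * headroom)  # keep some margin
      message = {
          "type": "RATE_CONTROL",
          "session": session_id,
          "measured_bandwidth_kbps": measured_bandwidth_kbps,
          "requested_bitrate_kbps": target_bitrate,
      }
      return json.dumps(message)

  if __name__ == "__main__":
      # The network monitor has measured 3500 kbit/s of available bandwidth.
      print(build_rate_control_request("session-42", 3500))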

References
Limiting references

This place does not cover:

Flow control in packet networks

H04L 47/10

{for requesting retransmission, e.g. of data packets lost or corrupted during transmission from server (ARQ protocols H04L 1/18; implementation or adaptation of Internet protocol [IP], of transmission control protocol [TCP] or of user datagram protocol [UDP] H04L 69/16)}
Definition statement

This place covers:

The network asks the server to retransmit some data packets that have been lost or corrupted.
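
Purely as an illustration, a minimal Python sketch of detecting lost packets from a gap in sequence numbers and building a retransmission request; this is not an implementation of RTP/RTCP or any ARQ protocol, and the message layout is invented:

  from typing import Iterable, List

  # Hypothetical sketch: detect missing sequence numbers and build a
  # retransmission request for the server. The message layout is invented.
  def missing_sequence_numbers(received: Iterable[int]) -> List[int]:
      """Return the sequence numbers missing between the lowest and highest received."""
      received_set = set(received)
      if not received_set:
          return []
      low, high = min(received_set), max(received_set)
      return [seq for seq in range(low, high + 1) if seq not in received_set]

  def build_retransmission_request(stream_id: str, received: Iterable[int]) -> dict:
      return {"type": "RETRANSMIT", "stream": stream_id,
              "missing": missing_sequence_numbers(received)}

  if __name__ == "__main__":
      # Packets 103 and 105 were lost or corrupted in transit.
      print(build_retransmission_request("video-0", [100, 101, 102, 104, 106]))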

References
Limiting references

This place does not cover:

ARQ protocols

H04L 1/18

Implementation or adaptation of Internet protocol [IP], of transmission control protocol [TCP] or of user datagram protocol [UDP]

H04L 69/16

{Data processing by the network (data processing in packet switching systems H04L 12/56; flow control in packet networks H04L 47/10; intermediate storage or scheduling H04L 49/90; provisioning of proxy services in data packet switching networks H04L 67/56)}
Definition statement

This place covers:

The data stream can be altered by the transport medium.

References
Limiting references

This place does not cover:

Data processing in packet switching systems

H04L 12/56

Flow control in packet networks

H04L 47/10

Intermediate storage or scheduling

H04L 49/90

Provisioning of proxy services in data packet switching networks

H04L 67/56

Informative references

Attention is drawn to the following places, which may be of interest for search:

Secondary or local servers, which could also alter the data

H04N 21/222

{Controlling the complexity of the content stream, e.g. by dropping packets (intermediate media network packet handling H04L 65/765; proxy provisioning conversion or adaptation for reducing the amount or size of exchanged application data H04L 67/5651; negotiation of resources in wireless networks H04W 28/16)}
Definition statement

This place covers:

The control of the complexity is performed in the network / within the transmission medium (e.g. routers drop packets).
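
Purely as an illustration, a minimal Python sketch of a network element reducing stream complexity under congestion by dropping low-priority packets (e.g. enhancement-layer packets) first; the packet structure and the priority scheme are invented for this example:

  from dataclasses import dataclass
  from typing import List

  @dataclass
  class Packet:
      seq: int
      layer: str       # "base" or "enhancement"
      size_bytes: int

  # Hypothetical sketch: keep base-layer packets first, then enhancement
  # packets while the available byte budget allows it.
  def shape_stream(packets: List[Packet], available_bytes: int) -> List[Packet]:
      kept, budget = [], available_bytes
      for pkt in sorted(packets, key=lambda p: 0 if p.layer == "base" else 1):
          if pkt.size_bytes <= budget:
              kept.append(pkt)
              budget -= pkt.size_bytes
      return sorted(kept, key=lambda p: p.seq)  # restore transmission order

  if __name__ == "__main__":
      stream = [Packet(1, "base", 800), Packet(2, "enhancement", 1200),
                Packet(3, "base", 800), Packet(4, "enhancement", 1200)]
      print([p.seq for p in shape_stream(stream, available_bytes=2000)])  # -> [1, 3]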

References
Limiting references

This place does not cover:

Intermediate media network packet handling

H04L 65/765

Proxy provisioning conversion or adaptation for reducing the amount or size of exchanged application data

H04L 67/5651

Negotiation of resources in wireless networks

H04W 28/16

Transmission of management data between client and server
Definition statement

This place covers:

High-level control signals.

Transmission by server directed to the client
Definition statement

This place covers:

Server-side control. Described here are all the functions provided in a server for high-level control of the clients.

for forcing some client operations, e.g. recording {(remote booting in general G06F 9/4416)}
Definition statement

This place covers:

Actions which the server forces the client to execute, e.g. channel tuning, retrieving from cache and inserting, recording, retrieving OS software from a carousel and upgrading, generating monitoring data, or activating a trigger.

References
Limiting references

This place does not cover:

Remote booting in general

G06F 9/4416

comprising parameters, e.g. for client setup
Definition statement

This place covers:

It includes the download of system parameters, e.g. for the decoder, for the display of the graphical user interface, or for the setup (including OS software) of the client.

Transmission by the client directed to the server
Definition statement

This place covers:

Client-side control. The nature of the uplink signal sent by the client to the server.

{Reference data, e.g. a movie identifier for ordering a movie or a product identifier in a home shopping application}
Definition statement

This place covers:

The client can transmit reference data such as a URL for accessing a WWW page, a movie ID for ordering a movie, or a product ID for a home shopping application.

{Data stored in the client, e.g. viewing habits, hardware capabilities, credit card number (arrangements where receivers interact with the broadcast H04H 20/38)}
Definition statement

This place covers:

The client can transmit stored data, such as viewing habits, hardware capabilities or a credit card number.

References
Limiting references

This place does not cover:

Arrangements where receivers interact with the broadcast

H04H 20/38

Acknowledgement
Definition statement

This place covers:

The client responds to an action triggered by the server, for example confirms that a download was successful.

Control parameters, e.g. trick play commands, viewpoint selection
Definition statement

This place covers:

The client sends parameters to control, for example, a VOD server (pause, fast-forward, etc.). Also includes viewpoint selection when an event is shot with several cameras.
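
Purely as an illustration, a minimal Python sketch of client-to-server control parameters; the command vocabulary and the trick_play_command function are invented for this example and do not correspond to any specific VOD protocol:

  import json
  from typing import Optional

  # Hypothetical sketch: trick-play and viewpoint-selection commands sent by a
  # client to a VOD server. All command names are invented for illustration.
  def trick_play_command(session_id: str, action: str, speed: float = 1.0,
                         viewpoint: Optional[str] = None) -> str:
      allowed = {"PLAY", "PAUSE", "FAST_FORWARD", "REWIND", "SELECT_VIEWPOINT"}
      if action not in allowed:
          raise ValueError(f"unknown action: {action}")
      command = {"session": session_id, "action": action, "speed": speed}
      if viewpoint is not None:
          command["viewpoint"] = viewpoint  # e.g. camera id for a multi-camera event
      return json.dumps(command)

  if __name__ == "__main__":
      print(trick_play_command("session-42", "FAST_FORWARD", speed=2.0))
      print(trick_play_command("session-42", "SELECT_VIEWPOINT", viewpoint="camera-3"))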

Generation or processing of content or additional data by content creator independently of the distribution process; Content per se {(arrangements for generating broadcast information H04H 60/02)}
Definition statement

This place covers:

  • Subject matter comprising video data and data related or unrelated thereto, generated by the content provider, wherein the defining feature is the presence of the data per se or processing operations to convert the data into a form suitable for the distribution process or to create an interactive application. This subgroup is directed to raw multimedia objects and processing operations thereof, wherein the operations involved are independent of the distribution process. The resulting data is then provided to the server for distribution purposes. Processing operations dependent on the distribution process are placed in H04N 21/20, H04N 21/60, H04N 21/40, according to the entity (respectively server, network, client) performing the operation.
  • The first layer of this subgroup pertains to the nature of the raw multimedia content and covers e.g. video, audio, data, commercials, graphics and software.
  • The second layer describes processing functions such as protecting the content by adding e.g. a watermark, certificate, signature, identification or defining content usage, or adding metadata or structuring the content, e.g. by decomposing it into layers, objects and segments.
  • The next layer is directed to the assembling of the content, e.g. the authoring of an interactive application.
  • The subgroups below cover in particular: (1) the definition and generation of metadata; (2) the protection of rights, covering the identification of the source, content identification, rights specification (e.g. content can be displayed or copied within a certain time period or number of times and by a specific group of users) as well as adding certificates or calculating signatures; scrambling of the content for transmission purposes is classified elsewhere, and systems for blocking specific video content transmitted over a network are classified elsewhere; (3) high-level tools or processes to generate a multimedia application from basic components (such as compiling an interactive application to be run on a target STB), pertaining e.g. to the design of the scene graph, the generation of a trailer or of timestamps, the packaging of the content into an XML file and the linking of multimedia objects to URLs.
References
Limiting references

This place does not cover:

Arrangements for generating broadcast information

H04H 60/02

Informative references

Attention is drawn to the following places, which may be of interest for search:

Compilation of EPG data containing metadata, also adding additional broadcast schedule data

H04N 21/26283

Monomedia components thereof
Definition statement

This place covers:

The basic monomedia components of multimedia content. Classifying these data types is useful for describing the kind of data processed in the system. This data will later be distributed electronically (from a server to a client using a WAN, or from the client to its peripherals using a LAN).

{involving special audio data, e.g. different tracks for different languages}
Definition statement

This place covers:

Audio. The audio component is usually present and related to the video component. Therefore, this place must be restricted to non-trivial aspects, such as the presence of several tracks for different languages.

{comprising music, e.g. song in MP3 format}
Definition statement

This place covers:

Music, songs, MP3 files. Distinct from the audio track of a movie.

{involving advertisement data (advertising per se G06Q 30/02)}
Definition statement

This place covers:

The commercial is itself a monomedia or multimedia object, but may be considered an external element which is added to the original content for commercial purposes.

References
Limiting references

This place does not cover:

Advertising per se

G06Q 30/02

{involving additional data, e.g. news, sports, stocks, weather forecasts}
Definition statement

This place covers:

Data provided as an extra service in the multimedia distribution system, e.g. stock quotes, sports results, news tickers or weather information.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Operation of end-user applications for supplemental services

H04N 21/478

{specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program}
Definition statement

This place covers:

Additional data that are related to the multimedia content, e.g. the biography of the actors in a movie or detailed information about an article seen in a video program, but not program descriptors in the sense of metadata.

{comprising emergency warnings (arrangements specially adapted for emergency or urgency in broadcast systems H04H 20/59; arrangements for providing alarms, notifications, alerts to substation in data switching networks H04L 12/1895)}
References
Limiting references

This place does not cover:

Arrangements specially adapted for emergency or urgency in broadcast systems

H04H 20/59

Arrangements for providing alarms, notifications, alerts to substation in data switching networks

H04L 12/1895

{involving graphical data, e.g. 3D object, 2D graphics}
Definition statement

This place covers:

Graphical objects can be combined with video, for example in MPEG-4. They can be of 2D or 3D nature. Text can also be considered here, as long as it is purely graphical and the content of the textual information does not matter.

{comprising still images, e.g. texture, background image}
Definition statement

This place covers:

Still images, e.g. textures, background images or any other still image to be used in a menu, should be classified here.

{involving special video data, e.g. 3D video}
Definition statement

This place covers:

  • Video is the main component in the area of interactive television and will normally be present in all documents. This entry should thus only be used to describe further details.
  • Motion vectors.
{involving executable data, e.g. software (arrangements for executing specific programs G06F 9/44; broadcasting computer programmes in broadcast systems H04H 20/91; involving the movement of software or configuration parameters H04L 67/34)}
Definition statement

This place covers:

Executable code can be sent for example to distribute commercial packages or upgrades to clients.

References
Limiting references

This place does not cover:

Arrangements for executing specific programs

G06F 9/44

Broadcasting computer programmes in broadcast systems

H04H 20/91

Involving the movement of software or configuration parameters in data packet switching networks

H04L 67/34

{End-user applications, e.g. Web browser, game}
Definition statement

This place covers:

High-level user applications, e.g. a web browser or a game, to be run on the client only.

{OS software}
Definition statement

This place covers:

Software module for the STB operating system.

{specially adapted to be executed by a peripheral of the client device, e.g. by a reprogrammable remote control}
Definition statement

This place covers:

Software to be transmitted by the client to a peripheral, e.g. PDA software. Also covers IR codes for reprogramming a remote control.

{dedicated tools, e.g. video decoder software or IPMP tool}
Definition statement

This place covers:

STB tools, e.g. decoder software, media player software (such as RealPlayer) or an IPMP tool.

Generation or processing of protective or descriptive data associated with content; Content structuring
Definition statement

This place covers:

Manipulating or adding information to the content to ensure its appropriate distribution.

Generation of protective data, e.g. certificates {(protecting software against unauthorised usage in a vending or licensing environment G06F 21/10)}
Definition statement

This place covers:

Identification of the source (e.g. motion picture studio), content identification, rights specification as well as adding certificates or calculating signatures to guarantee the integrity of the content and the rights of its provider. Protection is added at the generation of the content, before it enters the distribution system.

References
Limiting references

This place does not cover:

Protecting software against unauthorised usage in a vending or licensing environment

G06F 21/10

Informative references

Attention is drawn to the following places, which may be of interest for search:

Involving video stream encryption, e.g. re-encrypting a decrypted video stream for redistribution in a home network

H04N 21/4408

involving content or source identification data, e.g. Unique Material Identifier [UMID]
Definition statement

This place covers:

The content receives an identification number, e.g. a UMID, describing, for example, a video clip number or the source (e.g. the motion picture studio) it comes from.

involving usage data, e.g. number of copies or viewings allowed
Definition statement

This place covers:

  • The content provider defines how this content has to be used, e.g. whether it can be displayed or copied, how often, and by which group of users. This information is processed by the client-side rights manager or by the server-side rights management.
  • Covers also rental period of a movie.
{using a structured language for describing usage rules of the content, e.g. REL}
Definition statement

This place covers:

Structured languages for describing usage rules of the content, e.g. REL.
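
Purely as an illustration, a minimal Python sketch expressing usage rules (number of viewings, rental period) in a structured XML form built with the standard library; the element and attribute names are invented for this example and do not follow the actual REL schema:

  import xml.etree.ElementTree as ET

  # Hypothetical sketch: usage rules in a structured, XML-based form.
  # The element names are illustrative only, not real REL syntax.
  def build_usage_rules(content_id: str, max_viewings: int, rental_days: int) -> str:
      rules = ET.Element("usageRules", attrib={"contentId": content_id})
      ET.SubElement(rules, "permission", attrib={"action": "display",
                                                 "count": str(max_viewings)})
      ET.SubElement(rules, "constraint", attrib={"type": "rentalPeriod",
                                                 "days": str(rental_days)})
      return ET.tostring(rules, encoding="unicode")

  if __name__ == "__main__":
      # Content may be displayed five times within a seven-day rental period.
      print(build_usage_rules("movie-001", max_viewings=5, rental_days=7))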

involving watermark {(protecting executable software by watermarking G06F 21/16; image watermarking in general G06T 1/0021; watermarks inserted in still images for transmission purposes H04N 1/32144; inserting watermarks during video coding H04N 19/467)}
Definition statement

This place covers:

Watermarks being embedded in the content for later verification purposes.
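
Purely as an illustration, a minimal Python sketch of embedding a bit string into the least significant bits of grey-scale pixel values and reading it back; real watermarking schemes classified here are far more robust, and the functions below are invented for this example:

  from typing import List

  # Hypothetical sketch: LSB watermark embedding and extraction on a list of
  # 8-bit grey-scale pixel values, for later verification purposes.
  def embed_watermark(pixels: List[int], bits: str) -> List[int]:
      marked = list(pixels)
      for i, bit in enumerate(bits):
          marked[i] = (marked[i] & ~1) | int(bit)  # overwrite the LSB
      return marked

  def extract_watermark(pixels: List[int], length: int) -> str:
      return "".join(str(p & 1) for p in pixels[:length])

  if __name__ == "__main__":
      image = [120, 121, 122, 123, 124, 125, 126, 127]
      watermark = "1011"
      marked = embed_watermark(image, watermark)
      assert extract_watermark(marked, len(watermark)) == watermark
      print(marked)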

References
Limiting references

This place does not cover:

Watermarks inserted in still images for transmission purposes

H04N 1/32144

Inserting watermarks during video coding

H04N 19/467

Protecting executable software by watermarking

G06F 21/16

Image watermarking in general

G06T 1/0021

Generation or processing of descriptive data, e.g. content descriptors {(systems specially adapted for using meta-information in broadcast systems H04H 60/73)}
Definition statement

This place covers:

  • Program descriptors, e.g. abstract or actors, as video-specific metadata defined in MPEG-7. As metadata is a widely used word in a large range of applications, attention should be paid not to classify here aspects like identifiers, watermarks or additional data.
  • Covers also program categories, reviews by other viewers and scene descriptors for MPEG-4 objects.
References
Limiting references

This place does not cover:

Systems specially adapted for using meta-information in broadcast systems

H04H 60/73

Informative references

Attention is drawn to the following places, which may be of interest for search:

Compilation of the EPG data as such by adding broadcast schedule data to metadata

H04N 21/26283

Supplemental data specifically related to the content

H04N 21/8133

{involving a version number, e.g. version number of EPG data (arrangements for version control in computers G06F 8/71)}
Definition statement

This place covers:

Version of the content, e.g. version of a software module.

References
Limiting references

This place does not cover:

Arrangements for version control in computers

G06F 8/71

represented by keywords
Definition statement

This place covers:

Metadata is available as keywords for quicker matching.

Structuring of content, e.g. decomposing content into time segments
Definition statement

This place covers:

Structuring of the content, for example by decomposing the content into layers or objects.

{using Advanced Video Coding [AVC]}
Definition statement

This place covers:

This place is used to indicate the presence of video structured as in the coding standard Advanced Video Coding [AVC], also referred to in the literature as JVT, H.264, H.26L or MPEG-4 Part 10 (a misleading name, as the video is not coded in object form, as is generally the case in MPEG-4).

{by locking or enabling a set of features, e.g. optional functionalities in an executable program}
Definition statement

This place covers:

A piece of content has a set of features, which can be locked or enabled, e.g. optional functionalities in an executable program. Covers keyframes in video signals.

{involving pointers to the content, e.g. pointers to the I-frames of the video stream}
Definition statement

This place covers:

Entry points in the video stream.

{by decomposing the content in the time domain, e.g. in time segments}
Definition statement

This place covers:

A video stream is divided into time slices, e.g. segments or scenes.
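
Purely as an illustration, a minimal Python sketch of decomposing a piece of content into fixed-duration time segments, producing a simple segment index; the segment_timeline function and the durations used are invented for this example:

  from typing import List, Tuple

  # Hypothetical sketch: cut a content duration into fixed time segments.
  def segment_timeline(duration_s: float, segment_s: float) -> List[Tuple[float, float]]:
      """Return (start, end) pairs covering the whole duration."""
      segments, start = [], 0.0
      while start < duration_s:
          end = min(start + segment_s, duration_s)
          segments.append((start, end))
          start = end
      return segments

  if __name__ == "__main__":
      # A 125-second clip cut into 30-second segments (last segment is shorter).
      for index, (start, end) in enumerate(segment_timeline(125.0, 30.0)):
          print(f"segment {index}: {start:.1f}s - {end:.1f}s")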

Assembly of content; Generation of multimedia applications
Definition statement

This place covers:

Content assembly, performed typically by an operator on a work station in a production studio.

Content authoring
Definition statement

This place covers:

High-level tools or processes to generate a multimedia application from basic components. It compiles for example multimedia descriptors, e.g. MHEG, into an interactive application to be run on target STBs.

involving branching, e.g. to different story endings
Definition statement

This place covers:

Applications having several sub-scenarios, allowing different story developments.

using a description language, e.g. Multimedia and Hypermedia information coding Expert Group [MHEG], eXtensible Markup Language [XML] {(information retrieval of semistructured data, the underlying structure being taken into account, e.g. mark-up language structure data G06F 16/80)}
Definition statement

This place covers:

Multimedia application described using a standard description language such as MHEG or XML.

References
Limiting references

This place does not cover:

Information retrieval of semistructured data, the underlying structure being taken into account, e.g. mark-up language structure data

G06F 16/80

for generating interactive applications
Definition statement

This place covers:

Generation of scripts or executables, e.g. applets, to make an application interactive.

involving timestamps for synchronizing content
Definition statement

This place covers:

Describes the generation of timestamps for synchronising different pieces of content such as video, audio or different objects.
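
Purely as an illustration, a minimal Python sketch of generating presentation timestamps for video frames and audio blocks from a common clock so that a receiver can synchronise them; the 90 kHz clock rate mirrors common MPEG practice, but the functions and values below are invented for this example:

  # Hypothetical sketch: presentation timestamps for video and audio units
  # derived from one shared 90 kHz clock.
  CLOCK_HZ = 90_000

  def video_timestamps(frame_count: int, fps: float) -> list:
      return [round(i * CLOCK_HZ / fps) for i in range(frame_count)]

  def audio_timestamps(block_count: int, samples_per_block: int, sample_rate: int) -> list:
      return [round(i * samples_per_block * CLOCK_HZ / sample_rate)
              for i in range(block_count)]

  if __name__ == "__main__":
      # 25 fps video and 48 kHz audio in blocks of 1024 samples share one timeline.
      print(video_timestamps(5, 25.0))          # [0, 3600, 7200, 10800, 14400]
      print(audio_timestamps(5, 1024, 48_000))  # [0, 1920, 3840, 5760, 7680]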

Creating video summaries, e.g. movie trailer {(retrieval in video databases by using presentations in form of a video summary G06F 16/739)}
Definition statement

This place covers:

Generation of a trailer, i.e. selected scenes from the original video, or any edited version from an original, e.g. previews.

References
Limiting references

This place does not cover:

Retrieval in video databases by using presentations in form of a video summary

G06F 16/739

Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
Definition statement

This place covers:

References between one of the components and anything else, e.g. between two TV programs, or between a program and additional information on the internet (URL) or shopping information. Also covers ATVEF triggers in general.

{by creating hot-spots}
Definition statement

This place covers:

Objects or regions of the visual content are associated with further resources, e.g. hypervideo. Excludes URLs. Covers details of marking regions of an image.

{by using a URL (processing chained hypermedia data for information retrieval G06F 16/94; information retrieval from the Internet by using URLs G06F 16/955; URL in broadcast information H04H 20/93; Web-based protocols H04L 67/02)}
Definition statement

This place covers:

Multimedia components are linked in the editing process to an internet resource on a WWW server. This place is used to describe automatic access to a WWW server via an embedded URL.

Example(s) of documents found in this subgroup: EP1850594
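
Purely as an illustration, a minimal Python sketch of a data model linking a multimedia object to an embedded URL and resolving it on activation; the MediaObject class and its methods are invented for this example:

  from dataclasses import dataclass, field
  from typing import Dict

  # Hypothetical sketch: linking a multimedia object to URLs during authoring.
  @dataclass
  class MediaObject:
      object_id: str
      links: Dict[str, str] = field(default_factory=dict)  # label -> URL

      def add_link(self, label: str, url: str) -> None:
          self.links[label] = url

      def resolve(self, label: str) -> str:
          """Return the URL the client should fetch when the link is activated."""
          return self.links[label]

  if __name__ == "__main__":
      clip = MediaObject("clip-17")
      clip.add_link("more-info", "https://example.com/article-about-this-clip")
      print(clip.resolve("more-info"))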

References
Limiting references

This place does not cover:

Processing chained hypermedia data for information retrieval

G06F 16/94

Information retrieval from the Internet by using URLs

G06F 16/955

URL in broadcast information

H04H 20/93

Web-based protocols

H04L 67/02

Cameras or camera modules comprising electronic image sensors; Control thereof
Definition statement

This place covers:

Processes and apparatus related to the concept of electronic image capture using an electronic image sensor and the related control and processing of the generated electronic image signals.

Image pickup devices using electronic image sensors such as digital cameras, video cameras, TV cameras, CCTV cameras, surveillance cameras, camcorders, digital cameras embedded in mobile phones, aspects peculiar to the presence of electronic image sensors in electronic still cameras, digital still cameras, etc.

Electronic image capture by methods or arrangements involving at least the following step: the scanning of a picture, i.e. resolving the whole picture-containing area or scene into individual picture-elements and the derivation of picture-representative electrical signals related thereto, simultaneously or in sequence, e.g. by reading an electronic solid-state image sensor [SSIS] pickup device (e.g. CCD or CMOS image sensor) as an electronic image sensor converting optical image information into said electrical signals.

In colloquial speech said step is frequently formulated as, e.g. capturing a video sequence, digital photographing, etc.

Concerning cameras:

  • video cameras, TV cameras (e.g. in studios), CCTV cameras, surveillance cameras, camcorders; constructional and mechanical details related to such cameras even when not peculiar to the presence of the electronic image sensor e.g. housings;
  • arrangements/methods for image capture using an electronic image sensor, i.e. (i) sensor read-out; (ii) processing or use of electrical image signals from the electronic image sensor for the generation of camera control signals;
  • for controlling the electronic image sensor or its read-out for, e.g. exposure, scene selection for auto-focusing or electronic image enhancement, or processing of image signals captured by the electronic image sensor, e.g. white balance, electronic motion blur correction, noise suppression;
  • for controlling other camera functions, e.g. exposure, anti-shake compensation by influencing optical parts of the camera, focusing;
  • in-camera image processing, e.g. correction of lens distortion, defective pixel correction, noise suppression, removal of motion blur, improving the dynamic range of the final image;
  • electronic viewfinders, control of image pickup devices based on information displayed by the electronic viewfinder;
  • electrical and mechanical aspects of camera modules using electronic image sensors and related constructional details as in webcams or mobile phones;
  • remote control of cameras peculiar to the electronic image sensor, e.g. affecting their operation, or being based on a generated image signal;
  • adaptations peculiar to the presence or use of an electronic image sensor, the transmission, recording or other use of electrical image data and related circuitry, e.g. mounting of electronic image sensor, integrated cleaning system for the electronic image sensor, dust mapping, cooling of the electronic image sensor, controlling the operation of the electronic image sensor by external input signals;
  • cameras wherein the inventive contribution lies in the interaction of features covered above with those covered by G03B, e.g. switch-over between electronic motion-blur correction of electronic viewfinder during focusing and optical motion-blur correction of the lens during exposure, electronic-motion blur correction of the electronic image signal based on output signals of additional sensor or interaction between mechanical shutter and electronic control of the charge accumulation period of the electronic image sensor;
  • applications concerning studios and image capturing devices that cannot be classified in lower groups such as camera operation in general, e.g. for studio or TV events, processing for simulating film artefacts, virtual studio, virtual depth image, video assist systems, other studio equipment, e.g. autocues and teleprompters.
Relationships with other classification places

Groups in G03B are to be considered when the following aspects are concerned:

  • apparatus/methods for taking photographs using light sensitive film for image capture, apparatus/methods for printing, projecting or viewing images using film stock, photographic film or slides by optical means, e.g. mounting of optical elements, flashes, and their related controls, e.g. exposure, focus, (opto-)mechanical motion blur (anti-shake), cooling, beam shaping;
  • aspects of apparatus/methods for taking photographs using electronic image sensors for image capture, insofar as they correspond to those of said apparatus/methods for taking photographs using light-sensitive film, i.e. not peculiar to the presence or use of the electronic image sensor, e.g. mounting of optical elements or flashes, and their related controls insofar as they are not peculiar to the presence or use of the electronic image sensor, e.g. exposure, focus, (opto-)mechanical motion blur correction (anti-shake);
  • optical viewfinders;
  • remote control of cameras not peculiar to the electronic image sensor, e.g. not affecting their operation, or being based on a generated image signal;
  • optical aspects of camera modules using electronic image sensors and related constructional details (e.g. lens actuators).

The following scheme is intended to illustrate the relationship between H04N and G03B:

media150.png

The above image is intended to illustrate the relationship between H04N and G03B.

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Videophones

H04N 7/14

Closed circuit television systems

H04N 7/18

Cameras adapted for vehicles

B60R 1/00

Image or video recognition or understanding

G06V

Surveillance systems with alarms

G08B 13/194 - G08B 13/196

Mobile phones

H04M 1/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Intermediate information storage using still video cameras

H04N 1/2112

Video recording

H04N 5/76

Testing of cameras

H04N 17/00

Cameras used as input-only client peripherals for selective content distribution

H04N 21/4223

Circuitry of solid-state image sensors [SSIS] or control thereof

H04N 25/00

Radiation diagnosis, diagnostic aspect of medical imaging devices

A61B, A61C

Pyrometry, measuring temperature

G01J 5/00

Measuring X-rays, gamma radiation

G01T 1/00

Optical systems

G02B

Apparatus or arrangements for taking photographs

G03B

Image processing in general, i.e. not being exclusively adapted to be used in an image pickup device containing an electronic image sensor, or in studio devices or equipment

G06T

Editing of recorded image information

G11B 27/00

Associated working of recording or reproducing apparatus with TV camera or receiver in which the television signal is not significantly involved

G11B 31/006

Electric discharge tubes

H01J

Semiconductor technology of solid-state imaging devices, e.g. CMOS image sensors

H01L 27/146

CCD image sensors

H01L 27/148

Broadcasting

H04H

Constructional features of telephone sets

H04M 1/02

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

additional sensor

Sensor, other than the electronic image sensor, used for controlling a camera

camera

Device capturing image information represented by light patterns reflected from or emitted by objects, and exposing a light sensitive film or an electronic image sensor during a timed exposure, usually through an optical lens, and producing an image on a light sensitive film or an electrical image information signal respectively

electronic image sensor

Optoelectronic transducer, converting optical image information into an electrical signal susceptible of being processed, stored, transmitted or displayed

electronic spatial light modulator

Optoelectronic transducer converting electric signals representing image information into optical image information

projector

Device displaying image information by projection of light patterns, usually through an optical lens, wherein the light patterns are generated by illuminating an image, e.g. film or slide, or by converting an electric image signal into an optical signal using an electronic spatial light modulator

record

Registration (e.g. of sound or images) in permanent form by optical or electrical means for later reproduction

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

ADAS

Advanced driver assistance system

ADC

Analog to digital converter

AE

Automatic exposure control

AF

Autofocus

AFE

Analog front end

AGC

Automatic gain control

AI

Artificial intelligence

ANN

Artificial neural network

APD

Avalanche photodiode

APS

Active pixel sensor

CCD

Charge-coupled device

CDS

Correlated double sampling

CFA

Colour filter array

CID

Charge injection device

CIS

CMOS image sensor

CMOS

Complementary metal–oxide–semiconductor

CNN

Convolutional neural network

DSP

Digital Signal Processor

EMCCD

Electron multiplying charge-coupled device

ENG

Electronic news gathering

ESLM

Electronic spatial light modulator

EVF

Electronic viewfinder

EVS

Event-based vision sensor

FOV

Field of view

FPN

Fixed pattern noise

FLIR

Forward looking infrared

FPA

Focal plane array

FPD

Flat panel detector

FPGA

Field programmable gate array

GPU

Graphics processing unit

GUI

Graphical user interface

HDR

High dynamic range

LFM

Light flicker mitigation, LED flicker mitigation

LWIR

Long wavelength infrared

MWIR

Mid wavelength infrared

MTF

Modulation transfer function

NIR

Near infrared

NN

Neural network

NUC

Non-uniformity correction

OVF

Optical viewfinder

PD

Phase detection (pixel), phase difference (pixel)

PDAF

Phase-detection autofocus

PMD

Photonic mixer device

PTZ

Pan tilt zoom

QIS

Quanta image sensor

QWIP

Quantum well infrared photodetector

ROIC

Readout integrated circuit

SBNUC

Scene-based non-uniformity correction (NUC)

SPAD

Single-photon avalanche diode

SPD

Single-photon detection

SSIS

Solid state image sensor

SWIR

Short wavelength infrared

TDI

Time delay and integration

TEC

Thermoelectric cooler

TFA

Thin film on ASIC

TOF

Time of flight

WDR

Wide dynamic range

In patent documents, the following words/expressions are often used as synonyms:

"digital camera", "camcorder", "video camera", "still video camera", "camera" and "digital still camera"

for generating image signals from visible and infrared light wavelengths
Definition statement

This place covers:

Camera architectures:

  • for generation of colour signals by using switchable colour filters or light sources, or by using different image sensors;
  • for generation of RGB; RGBIR; RGBW; RW; R+(N)IR, G+IR, B+IR, W+R signals;
  • comprising visible and IR sensors;
  • comprising partial IR filters;
  • comprising visible light sensors without IR filter, i.e. a pixel captures both visible and IR light (Y+IR);
  • comprising switchable IR filters, i.e. the pixels are controlled to capture either only the visible light (Y) or both visible and IR light (Y+IR);
  • comprising multiple image sensors, at least one of which is sensitive to IR light.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangement of colour filter arrays [CFA] or filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths

H04N 25/131

Investigating the spectrum

G01J 3/28

Imaging spectrometer

G01J 3/2823

Special rules of classification

Image sensors comprising pixels sensitive to visible light and IR light and image sensors comprising pixels sensitive to both visible and IR light (Y+IR) and pixels sensitive to IR light (IR) are classified in group H04N 25/131.

media151.png

Optical arrangements associated therewith, e.g. for beam-splitting or for colour correction
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Beam splitting or combining systems per se

G02B 27/10

using opto-mechanical scanning means only
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Scanning by optical-mechanical means only, applicable to television systems in general

H04N 3/02

for generating image signals from infrared radiation only
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Transforming infrared radiation

H04N 5/33

for generating image signals from X-rays
Definition statement

This place covers:

Cameras or camera modules for generating image signals from X-rays.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Transforming X-rays into electric information

H04N 5/32

Circuitry of SSIS for transforming X-rays into image signals

H04N 25/30

Measuring X-radiation, gamma radiation, corpuscular radiation or cosmic radiation

G01T 1/00

Constructional details
Definition statement

This place covers:

Constructional details of cameras or camera modules (housing, mounting of optical parts, mounting of image sensing part, other camera parts).

Relationships with other classification places

Constructional details not peculiar to the presence or use of the electronic image sensor in electronic still picture cameras, digital still picture cameras are classified in subclass G03B.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Optical systems

G02B

Electric discharge tubes

H01J

Control of cameras or camera modules
Definition statement

This place covers:

Internal or external camera control for:

  • autofocusing operations;
  • computer-aided image capturing;
  • application programs for camera control;
  • detecting malfunction;
  • face recognition;
  • generating a panoramic field of view;
  • power saving or management;
  • compensating for shutter delay;
  • changing the image capture speed;
  • performing zoom operations;
  • remote control;
  • camera shake detection or correction.

Camera control using a GUI (graphical user interface).

Camera control using remote control.

Camera control via network.

Camera control in different operation modes like viewfinder or playback mode, autofocus mode, video mode or still capture mode.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for compensating brightness variation in the scene

H04N 23/70

Mountings, adjusting means or light-tight connections, for optical elements

G02B 7/00

based on recognised objects
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Image or video recognition or understanding

G06V

where the recognised objects include parts of the human body
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Recognition of human faces

G06V 40/16

Control of parameters via user interfaces
Definition statement

This place covers:

User interfaces to control camera parameters which can be separated from or integrated in the camera.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters, e.g. touchscreens

H04N 23/631

by using electronic viewfinders
Definition statement

This place covers:

Camera viewfinders displaying image signals provided by an electronic image sensor and optionally displaying additional information related to control or operation of the camera.

Relationships with other classification places

Optical viewfinders are classified in G03B 13/02.

{Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters}
Definition statement

This place covers:

A graphical user interface, e.g. a touchscreen, which is integrated on an electronic viewfinder.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Control of parameters via user interfaces

H04N 23/62

Control of camera operation in relation to power supply
Relationships with other classification places

Details of circuitry for controlling the generation or management of the power supply for a solid-state image sensor [SSIS] are classified in H04N 25/709.

Details of energy supply or management for control of exposure for digital still cameras not peculiar to the electronic image sensor are classified in group G03B 7/26.

Focus control based on electronic image sensor signals
Relationships with other classification places

Mounting of focusing coils is classified in H04N 23/54.

Focusing aids not based on image signals provided by an electronic image sensor are classified in group G03B 13/18.

Constructional details of means for focusing for cameras are classified in group G03B 13/32.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Generation of focusing signals, in general

G02B 7/28

{in combination with active ranging signals, e.g. using light or sound signals emitted toward objects}
Relationships with other classification places

Rangefinders coupled with focusing arrangements are classified in group G03B 13/20.

{Bracketing for image capture at varying focusing conditions}
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Bracketing for compensating for variations in the brightness

H04N 23/743

for stable pick-up of the scene, e.g. compensating for camera body vibrations
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Imaging systems using optical elements for stabilisation of the lateral and angular position of the image

G02B 27/64

Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Fluid-filled or evacuated lenses of variable focal length

G02B 3/14

Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors

H04N 23/58

TV type tracking system

G01S 3/7864

Analysis of motion by image processing in general

G06T 7/20

Determining position or orientation of objects by image processing in general

G06T 7/70

Tracking of movement using TV cameras of a target in burglar, theft or intruder alarms

G08B 13/19608

Circuitry for compensating brightness variation in the scene
Definition statement

This place covers:

Circuitry for compensating for variation in the brightness of the object. For example, dynamic range increase, bracketing, use of brightness histograms or brightness compensation by controlling shutter, filter, gain or illumination means.
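
Purely as an illustration, a minimal Python sketch of using a brightness histogram of a captured frame to derive a simple exposure correction factor; the thresholds, target value and control law are invented for this example:

  from typing import List

  # Hypothetical sketch: brightness histogram and a simple exposure gain
  # that moves the mean brightness of the frame towards a target value.
  def brightness_histogram(pixels: List[int], bins: int = 16) -> List[int]:
      hist = [0] * bins
      for value in pixels:                      # 8-bit pixel values, 0..255
          hist[min(value * bins // 256, bins - 1)] += 1
      return hist

  def exposure_gain(pixels: List[int], target_mean: float = 118.0) -> float:
      """Return a multiplicative gain, clamped to a plausible correction range."""
      mean = sum(pixels) / len(pixels)
      return max(0.25, min(4.0, target_mean / max(mean, 1.0)))

  if __name__ == "__main__":
      dark_frame = [30, 40, 35, 50, 45, 38, 42, 47]
      print(brightness_histogram(dark_frame))
      print(f"suggested gain: {exposure_gain(dark_frame):.2f}")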

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Control of solid-state image sensor [SSIS] exposure

H04N 25/50

Informative references

Attention is drawn to the following places, which may be of interest for search:

Exposure control for film cameras or cameras using an additional sensor

G03B 7/00

Bracketing, i.e. taking a series of images with varying exposure conditions
Definition statement

This place covers:

Bracketing used for increasing the dynamic range.
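
Purely as an illustration, a minimal Python sketch of merging an exposure bracket (the same scene captured with different exposure times) into a single higher-dynamic-range estimate by weighting well-exposed pixels; the weighting function and values are invented for this example:

  from typing import List

  # Hypothetical sketch: merge 8-bit bracketed frames into per-pixel relative
  # radiance estimates, favouring mid-tone (well-exposed) samples.
  def merge_bracket(frames: List[List[int]], exposures_s: List[float]) -> List[float]:
      def weight(value: int) -> float:
          return max(0.0, 1.0 - abs(value - 128) / 128.0)  # favour mid-tones

      merged = []
      for pixel_index in range(len(frames[0])):
          num = den = 0.0
          for frame, exposure in zip(frames, exposures_s):
              value = frame[pixel_index]
              w = weight(value)
              num += w * value / exposure  # normalise by exposure time
              den += w
          merged.append(num / den if den else frames[-1][pixel_index] / exposures_s[-1])
      return merged

  if __name__ == "__main__":
      under, over = [10, 60, 200], [80, 240, 255]  # two exposures of three pixels
      print(merge_bracket([under, over], [0.01, 0.08]))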

Relationships with other classification places

Bracketing for image capture at varying focusing conditions is classified in group H04N 23/676.

Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for suppressing or minimising disturbance in the image signal generation

H04N 23/81

Camera processing pipelines; Components thereof
Definition statement

This place covers:

Circuitry for suppressing impulsive noise, for gamma control and for processing colour signals.

Relationships with other classification places

This group does not cover image signal processing as such, or pipelines thereof, which are covered by G06T 1/00.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

General purpose image data processing

G06T 1/00

for suppressing or minimising disturbance in the image signal generation
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Noise reduction or noise suppression involving solid-state image sensors

H04N 25/60

Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for suppressing or minimising impulsive noise of video signals

H04N 5/213

Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination

H04N 23/745

for controlling camera response irrespective of the scene brightness, e.g. gamma correction
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for gamma control of video signals

H04N 5/202

specially adapted for colour signals
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuits for modifying colour signals by gamma correction

H04N 9/69

for processing colour signals
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuits for processing colour signals

H04N 9/64

{Demosaicing, e.g. interpolating colour pixel values}
Definition statement

This place covers:

Demosaicing, i.e. interpolating colour pixel values, only if jointly performed in combination with pixel scanning, image readout or other video processing operations within the image sensor.
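
Purely as an illustration, a minimal Python sketch recovering the green channel from a Bayer (RGGB) mosaic by copying green samples and averaging the green neighbours at red/blue positions; real camera pipelines use far more elaborate demosaicing, and the example data are invented:

  from typing import List

  # Hypothetical sketch: bilinear-style interpolation of the green channel
  # from an RGGB Bayer mosaic (green samples lie where (row + column) is odd).
  def green_channel(mosaic: List[List[int]]) -> List[List[float]]:
      rows, cols = len(mosaic), len(mosaic[0])
      green = [[0.0] * cols for _ in range(rows)]
      for r in range(rows):
          for c in range(cols):
              if (r + c) % 2 == 1:              # green sample: copy it
                  green[r][c] = float(mosaic[r][c])
              else:                             # red or blue sample: interpolate
                  neighbours = [mosaic[rr][cc]
                                for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                                if 0 <= rr < rows and 0 <= cc < cols]
                  green[r][c] = sum(neighbours) / len(neighbours)
      return green

  if __name__ == "__main__":
      bayer = [[200,  90, 210,  95],   # R G R G
               [ 85,  50,  88,  52],   # G B G B
               [198,  92, 205,  94],   # R G R G
               [ 84,  51,  86,  53]]   # G B G B
      for row in green_channel(bayer):
          print([round(v, 1) for v in row])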

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Computational demosaicing

G06T 3/4015

for matrixing
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuits for matrixing of colour signals

H04N 9/67

for controlling the colour saturation of colour signals, e.g. automatic chroma control circuits
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuits for controlling the amplitude of colour signals

H04N 9/68

for reinsertion of DC or slowly varying components of colour signals
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuits for reinsertion of DC and slowly varying components of colour signals

H04N 9/72

for colour balance, e.g. white-balance circuits or colour temperature control
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Colour balance circuits

H04N 9/73

Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
Definition statement

This place covers:

Systems using several cameras.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Constructional details of cameras

H04N 23/50

Computational photography systems, e.g. light-field imaging systems
Definition statement

This place covers:

Computational photography requiring combination of optical light modulation and computational reconstruction for acquiring dimensions of the plenoptic function. 

Light field imaging systems for light field acquisition:

  • using an array of cameras
  • using single sensor with temporal, spatial or frequency-domain multiplexing
  • temporal multiplexing with a programmable aperture
  • spatial multiplexing using an array of lens or prisms
  • frequency multiplexing by placing heterodyne mask

Camera systems comprising different types of image sensors, sensors of different resolutions, or sensors with different fields of view or focus.

Lensless imaging using:

  • coded aperture masks
  • zone plates
  • angle-sensitive pixels using diffraction gratings

Coded-aperture imaging; 

Extended Depth of Field Photography: 

  • using focal stacks 
  • focal sweep (sweeping the focal plane during the exposure, e.g. by moving the lens or the sensor) 
  • coded apertures

High speed imaging using:

  • multiple devices
  • high speed illumination
  • stroboscopic illumination
  • synthetic shutter speed imaging
Relationships with other classification places

This group covers image pickup devices using electronic image sensors for computational photography.

Devices for acquisition of colour spectrum, which is one dimension of the plenoptic function are classified in group G01J 3/00.

Pure image processing techniques used regardless of optical light modulation caused by the image pickup device are classified in groups G06T 3/00, G06T 5/00 and H04N 23/80.

High dynamic range imaging and exposure bracketing are classified in groups H04N 23/741 and H04N 23/743, respectively.

High resolution imaging by shifting the sensor relative to the scene is classified in group H04N 25/48.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Geometric image transformation in the plane of the image

G06T 3/00

Image enhancement or restoration

G06T 5/00

Circuitry of solid-state image sensors [SSIS]; Control thereof
Definition statement

This place covers:

Circuitry and driving details of solid-state image sensors, in particular circuitry and driving details directed to the following purposes and functions:

  • Reading out image data from the image sensor;
  • Performing image processing within the image sensor;
  • Control of exposure time by an electronic shutter;
  • Noise removal;
  • Improvement of resolution;
  • Extension of dynamic ranges.

Solid-state image sensors encompass charge-coupled devices [CCDs], charge injection devices [CIDs], addressable photodiode arrays, complementary metal oxide semiconductor [CMOS] image sensors, etc.

media133.png

Solid-state image sensors normally capture and output image data as raw images. However, there are special image sensors that capture, process and output the image data. Details of such sensors are classified in the main group H04N 25/00, for example:

  • image sensors having on-chip compression means for data rate reduction purposes, e.g. DCT, wavelet transformation in the sensor;
  • image sensors having on-chip compression means for data rate reduction purposes by outputting differential data, such as the difference between two exposures or events detecting a predetermined change of the image signal or differences between neighbouring pixels;

media117.png

  • compressive sensing sensors

media134.png

  • image sensors performing global operations such as generation of histograms, sorting, region segmentation/labelling, convolution functions, character recognition, or detecting maximum/minimum level;
  • image sensors with edge detection in the sensor, for detecting differences between pixel signals in the spatial domain, for spatial filtering;
  • image sensors with motion or event detection in the sensor, i.e. detecting change between pixel signals over time (a sketch follows this list);
  • image sensors comprising a dedicated temperature sensor or being controlled by the sensor temperature.
  • SSIS with power optimization
  • SSIS with processing time optimisation by using for example parallel processing circuitry
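
Purely as an illustration of the motion/event detection mentioned in the list above, a minimal Python toy model: each pixel compares its current value with a stored reference and only changes exceeding a threshold are reported as events. Real event-based vision sensors implement this per pixel in analogue or mixed-signal circuitry; the function and threshold below are invented:

  from typing import List, Tuple

  # Hypothetical sketch: report (pixel_index, signed_change) for pixels whose
  # value changed by at least the threshold since the stored reference.
  def detect_events(previous: List[int], current: List[int],
                    threshold: int = 10) -> List[Tuple[int, int]]:
      events = []
      for index, (old, new) in enumerate(zip(previous, current)):
          change = new - old
          if abs(change) >= threshold:
              events.append((index, change))
      return events

  if __name__ == "__main__":
      frame_t0 = [100, 102, 98, 105, 99]
      frame_t1 = [101, 130, 97, 80, 100]      # pixels 1 and 3 changed strongly
      print(detect_events(frame_t0, frame_t1))  # -> [(1, 28), (3, -25)]
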
Relationships with other classification places

While main group H04N 25/00 is, inter alia, used for classifying electronic circuits of solid-state image sensors and their driving, control and readout, the groups in main group H01L 27/00 cover details related to the implementation of the electronic circuits on a semiconductor chip.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of scanning heads

H04N 1/024

Scanning arrangements

H04N 1/04

Receivers for pulse based Lidars

G01S 7/486

Receivers for non-pulse based Lidars

G01S 7/4912

Computer systems using neural network models

G06N 3/02

General purpose image data processing

G06T 1/00

Arrangements for image or video recognition

G06V 10/00

Imager structures consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate

H01L 27/146

Charged coupled imagers

H01L 27/148

Compressive sampling or sensing

H03M 7/30

Organic image sensors

H10K 39/32

Special rules of classification

Where the solid-state image sensor function is classified in groups H04N 25/00 - H04N 25/683 classification should also be made in the group corresponding to the sensor technology, i.e. H04N 25/71, H04N 25/76 or H04N 25/79. For example, dark current correction for CCDs should be classified in both H04N 25/63 and H04N 25/71.

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

image sensor

Sensor that detects and conveys the information that constitutes an image. An image sensor may do so by producing a signal that represents location-dependent attenuation of light (as the light passes through or reflects off a medium). The signal is an electric signal such as an electric voltage or current. The light an image sensor may detect is not limited to visible light, but can be electromagnetic radiation in other wavelengths (e.g., infrared, ultraviolet, X-rays, gamma rays).

Synonyms and Keywords

In patent documents, the following abbreviations are often used:

ADC

Analog to digital converter

AE

Automatic exposure control

AF

Autofocus

AFE

Analog front end

AGC

Automatic gain control

AI

Artificial intelligence

ANN

Artificial neural network

APD

Avalanche photodiode

APS

Active pixel sensor

BSI

Back-side illumination

CCD

Charge-coupled device

CDS

Correlated double sampling

CFA

Colour filter array

CID

Charge injection device

CIS

CMOS image sensor

CMOS

Complementary metal–oxide–semiconductor

CNN

Convolutional neural network

CTIA

Capacitive transimpedance amplifier

DPS

Digital pixel sensor

DSP

Digital signal processor

EMCCD

Electron multiplying charge-coupled device

EVS

Event-based vision sensor

FD

Floating diffusion

FOV

Field of view

FPN

Fixed pattern noise

FLIR

Forward looking infrared

FPA

Focal plane array

FPD

Flat panel detector

FPGA

Field programmable gate array

GPU

Graphics processing unit

HDR

High dynamic range

LFM

Light flicker mitigation, LED flicker mitigation

LWIR

Long wavelength infrared

MWIR

Mid wavelength infrared

MTF

Modulation transfer function

NIR

Near infrared

NN

Neural network

NUC

Non-uniformity correction

OVF

Optical viewfinder

PD

Phase detection (pixel), phase difference (pixel)

PDAF

Phase-detection autofocus

PMD

Photonic mixer device

PTZ

Pan tilt zoom

QIS

Quanta image sensor

QWIP

Quantum well infrared photodetector

ROIC

Readout integrated circuit

SBNUC

Scene-based non-uniformity correction (NUC)

SPAD

Single-photon avalanche diode

SPD

Single-photon detection

SSIS

Solid state image sensor

SWIR

Short wavelength infrared

TDI

Time delay and integration

TEC

Thermoelectric cooler

TFA

Thin film on ASIC

TIA

Transimpedance amplifier

TOF

Time of flight

WDR

Wide dynamic range

for transforming different wavelengths into image signals
Definition statement

This place covers:

  • Architectures of colour filter arrays, e.g. arrangement of the colours in the colour filter array [CFA], number of the colours in the CFA, CFA comprising white or (N)IR pixels;
  • Filter arrays characterised by the selection of primary colours, complementary colours, other colours, e.g. emerald, panchromatic filters, elements with different spectral sensitivity for the same colour, e.g. G1 and G2;
  • Elements passing: IR, RGB+IR, W+IR;
  • Random arrangement of the colour filter elements; 
  • CFA characterised by the size of the periodically replicated pattern;
  • CFA using repeating patterns with more than one element of the same colour adjacent to each other, e.g. Quad Bayer;
  • Sensors for performing colour separation based on photon absorption depth;
  • Circuitry of the sensor for performing colour imaging operations.
including elements passing infrared wavelengths
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Transforming only infrared radiation into image signals

H04N 25/20

for transforming only infrared radiation into image signals
Definition statement

This place covers:

Solid state image sensors and control thereof for near and far infrared [IR] cameras.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Transforming infrared radiation

H04N 5/33

Non-uniformity correction

H04N 25/60

Radiation pyrometry

G01J 5/00

Integrated devices comprising at least one thermoelectric or thermomagnetic element

H10N 19/00

Special rules of classification

In many cases it is necessary to add a code for an identified function or circuitry design covered in group H04N 25/00.

for transforming thermal infrared radiation into image signals
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of photometry

G01J 1/02

Electric circuits of radiation detectors for photometry

G01J 1/44

Thermography

G01J 5/48

Formed in or on a common substrate controlled by radiation

H01L 27/144

for transforming X-rays into image signals
Definition statement

This place covers:

Electronic circuitry of X-ray imaging detectors that directly or indirectly detect incident X-ray photons, including:

  • current integrating detectors (CID) or energy integrating detectors (EID);
  • photon counting detectors (PCD). Some X-ray PCDs rely on continuous time current monitoring and pulse counting implementation of photon counting. Each pixel typically contains a pulse shaping circuit along with a thresholding system connected to a counter;
  • details of generating control signals based on data from the image sensor, like irradiation start/stop detection based on dummy readouts or on signals from specific pixels;
  • operation and control of different sensor modes, like entering and controlling a sleep mode.
Relationships with other classification places

Subject matter focusing on the measurement of X-radiation, on a measurement principle or its technological implementation, or on a dedicated measurement-related circuit design should be classified in G01T 1/00.

When the imaging X-ray sensor is described with details related to systems for measuring X-ray radiation with semiconductor detectors, classification should also be made in G01T 1/00. This is especially the case if details of circuitry for detecting, measuring or adapting the detected signal in order to obtain a correct signal are described, e.g. corrections for pile-up, for trapped charges, for dead-time, or to determine energy or spatial corrections.

References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Apparatus for radiation diagnosis

A61B 6/00

Investigation of materials using radiation

G01N 23/00

Nuclear Magnetic Resonance imaging systems

G01R 33/48

X-ray apparatus or circuits therefor

H05G 1/00

Informative references

Attention is drawn to the following places, which may be of interest for search:

Transforming X-rays into electric information

H04N 5/32

Cameras or camera modules for generating image signals from X-rays

H04N 23/30

Measuring X-radiation, gamma radiation, corpuscular radiation or cosmic radiation

G01T 1/00

Circuit arrangements not adapted to a particular type of detector for measuring radiation intensity

G01T 1/17

Measuring X-radiation, gamma radiation, corpuscular radiation or cosmic radiation with semiconductor detectors

G01T 1/24

Apparatus for taking X-ray photographs

G03B 42/02

X-ray photographic processes

G03C 5/16

Image data processing

G06T

Medical informatics

G16H

Collimators

G21K 1/02

X-ray tubes

H01J 35/00

Integrated devices comprising at least one organic element specially adapted for switching

H10K 19/00

Integrated devices comprising organic radiation-sensitive element specially adapted for detecting X-ray radiation

H10K 39/36

Electric solid-state thin-film or thick-film devices

H10N 97/00

Special rules of classification

In many cases, it is necessary to add a symbol for an identified function or circuitry design covered in group H04N 25/00.

Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
Definition statement

This place covers:

Details of extracting pixel data from an image sensor by controlling scanning circuits, for example:

scanning individual pixels or pixel regions; 

media135.png

using specific scanning sequences, like scanning in blocks, pyramidal, in different directions;

media136.png

scanning and reading out data from a pixel while the pixel accumulates new charges, or scanning and reading out data from a block while the block processes the next data; normally additional storage elements such as double buffers or parallel processing circuits are used, e.g. reading a pixel while the next exposure is running, or reading out digital ADC data while the ADC is running the next conversion cycle, etc.;

scanning for high-speed operations where a number of frames are successively captured and stored in the sensor and then read out from the memories;

reading out more than one sensor

  • for increasing the field of view by combining the outputs of two or more sensors, e.g. panoramic imaging; 
  • having different imaging characteristics, e.g. exposure time, aperture size, gain, resolution or colour;

for performing data compression

  • by compressive sensing or sparse sampling;
  • by DCT or wavelet transforms;
  • by data differencing.

by controlling the frame rate

  • of different regions of the image array;
  • the regions being variable.

media137.png

for extracting focusing pixel data; 

by non-destructive readout to read signals two or more times during the integration time of the pixel. The figure shows non-destructive readouts at time instants from T1 to T7, while the pixel signal (72, 74 or 76) increases as a result of the exposure during the integration time T (a sketch of this readout follows this list);

media138.png

for performing global operations, e.g. histogramming, sorting, region segmentation/labelling, convolution functions

  • for detecting maximum or minimum level

adapted to implement artificial neural networks [ANNs];

for push broom scanning or together with relative movement.
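
For illustration of the non-destructive readout listed above, a minimal sketch in Python, assuming that samples taken at instants T1 to T7 within one integration time are used to estimate the accumulating signal by a least-squares slope fit (sample-up-the-ramp); the function name, the sampling instants and the slope-fit use of the samples are illustrative assumptions only.

  import numpy as np

  def estimate_signal_up_the_ramp(sample_times, samples):
      """Estimate the signal accumulation rate from non-destructive readouts
      taken at several instants (e.g. T1 to T7) within one integration time.
      Returns the least-squares slope of the samples versus time."""
      t = np.asarray(sample_times, dtype=float)
      s = np.asarray(samples, dtype=float)
      slope, _offset = np.polyfit(t, s, 1)   # samples grow roughly linearly with time
      return slope

  # Hypothetical example: seven reads during a 70 ms integration time
  times = np.linspace(0.01, 0.07, 7)                      # T1 .. T7 in seconds
  reads = 1000.0 * times + np.random.normal(0.0, 2.0, 7)  # ramp plus read noise
  print(estimate_signal_up_the_ramp(times, reads))        # roughly 1000 units/s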

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for scanning or addressing the pixel array in CCD sensors

H04N 25/74

Circuitry for scanning or addressing the addressed pixel array

H04N 25/779

Special rules of classification

In addition to classification in H04N 25/40, classification should also be made in H04N 25/74 or H04N 25/779 when specific details of the scanning circuits are provided. 

The readout operations in most cases influence the exposure time of the pixels. Accordingly, classification should also be made in H04N 25/53 when details related to the control of the exposure/integration time are disclosed.

{Extracting pixel data from a plurality of image sensors simultaneously picking up an image, e.g. for increasing the field of view by combining the outputs of a plurality of sensors}
Definition statement

This place covers:

Arrangements and scanning details of image sensing units comprising a plurality of image sensor arrays or panels, for example:

  • compound image sensing units arranged to direct light from a different section of the field of view onto different image sensors or different image sensor regions;

media76.png

media77.png

  • large X-ray image sensing unit realised by tessellating several sensor panels;

media78.png

  • image sensing units that form images of the same or at least partially overlapping photographic region upon each of a plurality of pixel regions;

media79.png

  • an imaging unit forms images of the same or at least partially overlapping photographic region upon each of a plurality of pixel regions, wherein the pixels are offset by a fraction of the pixel pitch;
  • details of correction and alignment between the image sensors and the respective optical systems by selective scanning of the image sensors;
  • the image sensors may not be on the same plane or on the same chip, and the optical system may comprise mirrors or prisms;
  • the image sensors or the different image sensor regions have different imaging characteristics like exposure time, aperture size, gain, resolution;

media80.png

  • the image sensors or the different image sensor regions having different focal planes;
  • the image sensors or the different image sensor regions having fields of view of different sizes;

media81.png

  • the image sensors or the different image sensor regions have different resolution;
  • the image sensors or the different image sensor regions have different colours and normally overlapping FOV;
  • the image sensors or the different image sensor regions have different colours, one of which is for IR or for depth measurement;
  • used for push broom scanning.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Cameras or camera modules comprising electronic image sensors or control thereof for generating image signals from different wavelengths with multiple sensors

H04N 23/13

Cameras using two or more image sensors

H04N 23/45

Constructional details of television cameras

H04N 23/50

Linear arrays using abutted sensors

H04N 25/701

Modular detectors for measuring radiation intensity

G01T 1/243

by switching between different modes of operation using different resolutions or aspect ratios, e.g. switching between interlaced and non-interlaced mode
Definition statement

This place covers:

Image sensors comprising different readout modes or being switchable between such modes, for example between interlaced and non-interlaced mode, or between high- and low-resolution modes, etc.

One of the modes can be related to readout of specific pixels only; for example, there may be different modes for reading out focusing pixels and for reading out exposure pixels. The switching between different modes can be initiated, for example:

  • upon a change of the camera mode, e.g. auto exposure, auto focus, AWB, preview mode, video/still picture mode; or
  • upon scene parameters like motion or object detection.
by partially reading an SSIS array
Definition statement

This place covers:

Partial readout of an SSIS during one frame or sub-frame, including where the image sensor performs scanning of different image sensor regions at different resolutions.

media82.png

by reading contiguous pixels from selected rows or columns of the array, e.g. interlaced scanning
Definition statement

This place covers:

  • Scanning only selected rows or columns of the array, for example interlaced scanning or reading only every N-th line of pixels in a frame.

media83.png

Special rules of classification

Classification should also be made in H04N 25/46 if the interlaced scanning is combined with binning of the neighbouring pixels. However, if all pixel signals are read out (i.e. provided to the column output lines or to the charge transfer lines of the CCD) and some of them are then added or binned, classification is only made in H04N 25/46.

by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
Definition statement

This place covers:

Scanning details for reading selected regions of the array, e.g. for performing electronic zooming.

media140.png

by skipping some contiguous pixels within the read portion of the array
Definition statement

This place covers:

Scanning details for thinned-out reading of pixel signals.

media86.png

by combining or binning pixels
Definition statement

This place covers:

Binning charges in CCD sensors wherein:

  • the colours of the colour filter array are preserved;
  • the colours of the colour filter array are mixed;
  • weighted addition or low pass filtering is performed.

Binning of charges or adding signals in CMOS sensors wherein:

  • the colours of the colour filter array are preserved;
  • the colours of the colour filter array are mixed;
  • weighted addition or low pass filtering is performed.

Binning of charges in CMOS sensors wherein:

  • charges of different photodiodes are added to a shared floating diffusion;
  • a photodiode is connectable to a different shared floating diffusion.

Combining of pixel voltage or current signals in CMOS sensors wherein:

  • the combining is implemented in the ADC – typically the counter or the memory of the ADC is arranged to perform addition of the pixel signals;
  • the combining is implemented in a column amplifier;
  • column processing analogue circuits are used to perform addition in h- or v- direction;
  • summing of the currents of several source followers is used.
Image sensors with pixel address output; Event-driven image sensors; Selection of pixels to be read out based on image data
Definition statement

This place covers:

Dynamic vision sensors [DVS]: scanning individual pixels or pixel regions based on image data, for example where the scanning is based upon time events, level changes or exposure level. The figure below shows an example of a pixel of such a sensor.

media152.png

Circuitry and control thereof of DVS pixels:

  • pre-amplifier stages, which often include a current to voltage converter (e.g. block 410);
  • difference or subtractor stages (e.g. block 430);
  • comparator stages (e.g. block 440);
  • storages of event flags in DVS pixels (e.g. memories in the pixel arbiter to store events until they are read);
  • circuitry of arbiters and read-out stages of DVS pixels;
  • DVS pixels with an intensity pixel output, e.g. including an active pixel sensor [APS] pixel part. The photodiode can be shared, or two separate photodiodes can be provided in the pixel;
  • DVS pixels used to control an active pixel sensor [APS] pixel output;
  • DVS pixels with multiple thresholds to detect more than an ON/OFF event;
  • control of threshold and bias settings of DVS pixels, e.g. a global or local signal setting or controlling either bias currents or threshold voltages of DVS pixels;
  • reduction of noise events detected by DVS sensors;
  • binning or spatial filtering of DVS pixels.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Pixels for event detection

H04N 25/707

Increasing resolution by shifting the sensor relative to the scene
Definition statement

This place covers:

Circuits and arrangements for increasing the resolution by shifting the sensor relative to the scene, including:

  • implementing the micro-scanning or pixel shift by moving optical parts of the camera;
  • implementing the micro-scanning or pixel shift by moving the sensor;
  • increasing resolution by moving or exposing at subpixel positions;
  • increasing resolution by using the relative motion of the images captured caused by the camera shake.
Control of the SSIS exposure
Definition statement

This place covers:

Circuitry and means realised inside the sensor [SSIS] chip, or even inside each pixel circuit, to control the exposure settings, e.g. rolling shutter, global shutter, exposure time, gain, etc.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Circuitry for compensating brightness variation in the scene

H04N 23/70

Control of the integration time
Definition statement

This place covers:

Details of control of the integration time, in particular:

  • details of performing global shutter operations in an image sensor;
  • details of performing rolling shutter operations in an image sensor;
  • integration time control and synchronisation of the electronic shutter in combination with a light source;
  • integration time control and synchronisation of the electronic shutter in combination with mechanical shutter control;
  • integration time control and synchronisation of the electronic shutter as a function of motion in the scene;
  • coded exposure for flutter shutter cameras.

media87.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Control of camera brightness compensation by influencing the exposure time

H04N 23/73

Control of camera brightness compensation by influencing the scene brightness using illuminating means

H04N 23/74

Control of camera brightness compensation by influencing optical camera components

H04N 23/75

by controlling rolling shutters in CMOS SSIS
Definition statement

This place covers:

  • Details of controlling rolling shutters.

media88.png

by using differing integration times for different sensor regions
Definition statement

This place covers:

Details of controlling the integration times of different regions of the image sensor wherein:

  • the different regions can be predetermined;
  • the different regions can be dynamically selected, for example based upon exposure conditions, ROI, speed or user selection;
  • the integration time is controlled for each pixel.
Relationships with other classification places

If the control of the integration times is related to extension of dynamic range, classification in H04N 25/57 should also be considered. 

depending on the spectral component
Definition statement

This place covers:

Details of controlling the integration times depending on the colour of the pixel.

Control of the dynamic range
Definition statement

This place covers:

Circuitry and control thereof for extending the dynamic range of an SSIS. The dynamic range of an SSIS is defined as the ratio of the maximum possible, non-saturating input signal (full well capacity) to the minimum detectable input signal, which is limited by the total noise floor (in the dark). Circuitry and control thereof for converting the brightness of the scene into signal values by a non-linear response function.

Circuitry and control thereof for reading image signals from which an HDR image can be generated.
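
For orientation, the ratio defined above is commonly expressed in decibels. A minimal sketch, assuming the full well capacity and the noise floor are both given in electrons; the function name and the numerical values are illustrative only.

  import math

  def dynamic_range_db(full_well_electrons, noise_floor_electrons):
      """Dynamic range: ratio of the maximum non-saturating input signal
      (full well capacity) to the minimum detectable input signal (noise
      floor), expressed in dB."""
      return 20.0 * math.log10(full_well_electrons / noise_floor_electrons)

  # Illustrative values: 20 000 e- full well, 3 e- noise floor
  print(round(dynamic_range_db(20_000, 3), 1))   # approx. 76.5 dB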

involving a non-linear response
Definition statement

This place covers:

  • Controlling the sensor dynamic range using image sensors with pixel circuits having a non-linear response; 
  • Driving and control thereof.

The non-linear response can be achieved in different ways, for example, by using a specific photodetector, by controlling the reset or the transfer gate driving signals, by controlling the gain or by using non-linear amplifiers.

Relationships with other classification places

Details of control of the charge storable in the pixel are classified in group H04N 25/59.

While group H04N 25/58 covers extending the dynamic range by using multiple exposures, group H04N 25/571 covers the response characteristic (or opto-electronic conversion function) of the sensor during a single exposure.

{the logarithmic type}
Definition statement

This place covers:

  • Image sensors comprising pixel circuits having a logarithmic characteristic;
  • Image sensors comprising pixel circuits having a linear-logarithmic (lin-log) characteristic;
  • Driving and control thereof.

media142.png

{with a response composed of multiple slopes}
Definition statement

This place covers:

Image sensors comprising pixel circuits having multi-slope characteristics and driving and control thereof.

involving two or more exposures
Definition statement

This place covers:

Details for driving and control of image sensors wherein the dynamic range is extended by multiple exposures. The term exposure is not limited to exposure time but rather specifies the overall amount of detected light, which further depends on the pixel size, pixel sensitivity, conversion gain, etc.
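
As a rough illustration of how signals from two exposures may be combined into one extended-range signal, a minimal sketch follows; the saturation threshold, the exposure ratio and the simple switch-over rule are assumptions for illustration, not a prescribed combination method.

  import numpy as np

  def merge_two_exposures(long_exp, short_exp, exposure_ratio, sat_level=4095):
      """Combine a long and a short exposure of the same scene: the short
      exposure is scaled to the range of the long one and used wherever the
      long exposure saturates."""
      long_exp = np.asarray(long_exp, dtype=float)
      short_scaled = np.asarray(short_exp, dtype=float) * exposure_ratio
      return np.where(long_exp >= sat_level, short_scaled, long_exp)

  # Hypothetical 12-bit samples, 16:1 ratio between the two exposures
  print(merge_two_exposures([1200, 4095], [80, 3000], exposure_ratio=16))
  # [ 1200. 48000.]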

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Combination of exposures for increasing the dynamic range

H04N 23/741

Bracketing, i.e. taking a series of images with varying exposure conditions

H04N 23/743

acquired simultaneously
Definition statement

This place covers:

Image sensors and driving circuits for controlling sensor dynamic range using two or more simultaneously acquired exposures, including:

  • providing pixels for multiple exposures, like long- and short-time exposure pixels, high- and low-sensitivity pixels;
  • reading out pixels non-destructively several times within a frame to provide multiple exposures;
  • partial readout of pixels of the array (partial charge transfer or charge skimming) during the exposure time.
with different integration times
Definition statement

This place covers:

Controlling sensor dynamic range using two or more simultaneously acquired exposures with different integration times, including:

  • providing pixels for multiple exposures, such as long and short exposure time pixels;

media90.png

  • providing pixels that are read out several times non-destructively during a single exposure period, wherein the readout signals are combined to generate a high dynamic range signal;
  • providing pixels that have charge partially transferred to a storage node (charge skimming) during the exposure period, wherein the signals from the partial readout and from the end of exposure are combined to generate a high dynamic range signal.

media91.png

Relationships with other classification places

While group H04N 25/533 covers control of exposure time in different regions of the image sensor, group H04N 25/583 provides for the simultaneous acquisition of two or more exposures using different integration times which are combined in such a way that a new high dynamic range image signal is generated. If a partial or non-destructive readout is used only for setting the exposure period of the pixel, classification should be made in H04N 25/533.

with pixels having different sensitivities within the sensor, e.g. fast or slow pixels or pixels having different sizes
Definition statement

This place covers:

Controlling sensor dynamic range using two or more simultaneously acquired exposures using different pixel sensitivities, including the use of sensors and driving circuits with:

  • different sensitivities;
  • different sizes;
  • different conversion gains.

The combination of these signals is used to generate an HDR signal.

acquired sequentially, e.g. using the combination of odd and even image fields
Definition statement

This place covers:

Driving and control of image sensors for sequentially taking multiple exposures for extending the dynamic range. The signals from the multiple exposures can be stored in the pixel or outside of the pixel array.

media92.png

with different integration times, e.g. short and long exposures
Definition statement

This place covers:

Controlling sensor dynamic range using two or more sequentially acquired exposures with different integration times, e.g. using long and short integration times.

media93.png

by controlling the amount of charge storable in the pixel, e.g. modification of the charge conversion ratio of the floating node capacitance
Definition statement

This place covers:

  • Details related to image sensors comprising pixels that can modify the charge conversion ratio of the floating node. If a transfer gate is used, the amount of electric charge generated in the photoelectric converter PD is not controlled; rather, the charge-to-voltage conversion ratio of the floating diffusion is controlled.

media153.png

  • Details related to image sensors comprising pixels which can store and read out overflow charges.

media95.png

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Pixel circuitry comprising storage means other than floating diffusion

H04N 25/771

Noise processing, e.g. detecting, correcting, reducing or removing noise
Definition statement

This place covers:

  • Noise processing circuits for reduction of random noise, line noise, high frequency noise, temporal noise caused by voltage drop of power supply or of driving circuits when implemented as part of the image sensor;
  • Circuits for control of bandwidth of amplifiers or comparators implemented in the image sensor as far as related to the overall noise level of the image sensor;
  • Noise processing circuits for reduction of optical crosstalk, light leakage, colour mixing and other noise originating from the components of the associated optical system;
  • Noise processing circuits for reduction of frame-to-frame variations caused by the image sensor and not by external illumination variation;
  • Image sensor noise characterisation, e.g. methods to derive parametric models to quantify different sensor noise types (such as readout noise or photo-shot noise) in the sensed image according to e.g. Gaussian, Poisson or uniform probability distribution functions; methods to calibrate and obtain noise levels of sensor data for further use, for example in filtering applications.
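
As an illustration of the characterisation mentioned in the last item, a minimal sketch assuming a Poisson-Gaussian model in which the pixel variance grows linearly with the mean signal; the flat-field stack, the linear mean-variance fit and all numerical values are assumptions for illustration.

  import numpy as np

  def fit_mean_variance(frames):
      """Fit variance ~ slope * mean + offset over a stack of frames of the
      same (uniform or slowly varying) scene; the slope relates to the
      conversion gain (shot noise) and the offset to the readout noise."""
      frames = np.asarray(frames, dtype=float)
      pixel_mean = frames.mean(axis=0).ravel()
      pixel_var = frames.var(axis=0, ddof=1).ravel()
      return np.polyfit(pixel_mean, pixel_var, 1)   # (slope, offset)

  # Hypothetical stack: illumination gradient, shot noise plus read noise
  rng = np.random.default_rng(0)
  illum = np.linspace(100, 2000, 64 * 64).reshape(64, 64)
  frames = rng.poisson(illum, (50, 64, 64)) + rng.normal(0.0, 2.0, (50, 64, 64))
  slope, offset = fit_mean_variance(frames)
  print(round(slope, 2), round(offset, 1))   # slope near 1, offset near 4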
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Camera processing pipelines or components thereof for suppressing or minimising disturbance in the image signal generation

H04N 23/81

the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
Definition statement

This place covers:

  • Circuits for detecting and correcting flare;
  • Circuits for detecting and correcting shading and vignetting;
  • Circuits for detecting and correcting geometrical distortions.
Relationships with other classification places

Although not always specific to SSIS, the noise/distortion produced by a lens is nevertheless classified in group H04N 25/61 and not in group H04N 23/81. This has been done to facilitate the search. Corrections of chromatic aberrations, which can also be related to lenses, are classified in group H04N 25/611. All other noise suppression or disturbance minimisation in picture signal generation, e.g. in a camera having an electronic image sensor, should be classified in group H04N 23/81.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Camera processing pipelines or components thereof for suppressing or minimising disturbance in the image signal generation

H04N 23/81

Correction of chromatic aberration

H04N 25/611

Geometric image transformation in the plane of the image

G06T 3/00

Image enhancement performing geometric correction

G06T 5/80

involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF]
Definition statement

This place covers:

Circuits for detecting and correcting noise originating from the associated optical system involving a transfer function modelling the optical system.

involving a correlated sampling function, e.g. correlated double sampling [CDS] or triple sampling
Definition statement

This place covers:

Details of circuits for implementing:

  • double sampling [DS] – these circuits compensate for offsets caused by the varying characteristics of pixel amplifiers (source followers);
  • correlated double sampling [CDS] – these circuits further reduce the kTC (reset) noise;
  • multiple sampling – multiple sampling of a reset signal and an image signal from a pixel is used to reduce or average the random noise;
  • (correlated) double/multiple sampling function implemented in the analogue domain, i.e. by using clamping circuits, or by using separate sampling capacitors for the reset signal and the image signal;
  • (correlated) double sampling function implemented at least partially in the ADC;
  • (correlated) double sampling function implemented in the digital domain;
  • CDS circuits per pixel;
  • details of arrangement of the CDS circuit as part of the readout circuit;
  • CDS arranged per column;

media143.png

  • CDS arranged at the output of the sensor.
Relationships with other classification places

If the specific position of the CDS in the image sensor is to be classified, classification should be made under H04N 25/70 according to the respective SSIS architecture. Correlated double sampling is a noise reduction technique in which the reference voltage of the pixel (i.e. the pixel's voltage after it is reset) is subtracted from the signal voltage of the pixel (i.e. the pixel's voltage at the end of integration) at the end of each integration period, to cancel kTC noise (the thermal noise associated with the sensor's capacitance). Therefore, classification should not be made in H04N 25/65 (reduction of kTC noise) if only CDS is used for kTC noise reduction.
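
A minimal numerical sketch of the subtraction described above, assuming the reset (reference) level and the end-of-integration (signal) level of each pixel have already been sampled and digitised; the sample values and the sign convention are illustrative only.

  import numpy as np

  def correlated_double_sampling(signal_samples, reset_samples):
      """Subtract each pixel's reset-level sample from its signal sample,
      so that the kTC and offset components common to both samples cancel."""
      return np.asarray(signal_samples, float) - np.asarray(reset_samples, float)

  # Hypothetical digitised samples for three pixels
  reset = np.array([100.0, 98.0, 103.0])
  signal = np.array([600.0, 850.0, 120.0])
  print(correlated_double_sampling(signal, reset))   # [500. 752.  17.]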

for reducing electromagnetic interference, e.g. clocking noise
Definition statement

This place covers:

Circuits for detecting and reducing electromagnetic interference and clocking noise.

Such electromagnetic interference can be caused by sources internal or external to the sensor, such as lens focusing motors.

Detection or reduction of noise due to excess charges produced by the exposure, e.g. smear, blooming, ghost image, crosstalk or leakage between pixels
Definition statement

This place covers:

Circuits for detecting and reducing excess charges produced by the exposure.

for the control of blooming
Definition statement

This place covers:

  • Circuits for control of blooming by resetting pixels that are not read out but are adjacent to pixels that are read out, so as to prevent saturation of the non-read pixels from affecting the adjacent read-out pixels;
  • Circuits for sweeping out electric charges beforehand so that they do not leak while a prior row is being exposed;
  • Circuits for controlling pixels comprising a storage element for storing the overflow photo-charges, wherein the stored overflow charge is used to extend the dynamic range of the image sensor;
  • Evacuation of excess charges produced by the exposure via the output lines or the reset lines of addressed sensors;
  • Active CMOS pixel sensors comprising a dedicated reset or overflow transistor directly connected to the photoelectric converter; such a pixel is known as a 5T pixel.
Relationships with other classification places

Details related to image sensors comprising pixels that can store and read out overflow charges are to be classified in group H04N 25/59.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Partially reading an SSIS array

H04N 25/44

Controlling the dynamic range by controlling the amount of charge storable in the pixel

H04N 25/59

{by controlling anti-blooming drains}
Definition statement

This place covers:

Anti-blooming drains used in the CCD sensors.

{by evacuation via the output or reset lines}
Definition statement

This place covers:

  • Evacuation of excess charges produced by the exposure via the output lines or the reset lines of addressed sensors.
  • Active CMOS pixel sensors may comprise a dedicated reset or overflow transistor directly connected to the photoelectric converter. Such a pixel is known as a 5T pixel.
for the control of smear
Definition statement

This place covers:

Circuits for control of smearing in CCD sensors – the smearing noise appears as vertical stripes in the image.

Reduction of noise due to residual charges remaining after image readout, e.g. to remove ghost images or afterimages
Definition statement

This place covers:

Circuits for reduction of residual charges.

Detection or reduction of inverted contrast or eclipsing effects
Definition statement

This place covers:

Circuits for detection and reduction of inverted contrast or eclipsing.

Eclipsing can occur when at least some pixels of the CMOS imager are exposed to strong light, such as direct illumination from the sun. The strong light may cause electrons to spill over from the photodiode into the floating diffusion region, which results in an erroneous reset signal being sampled (e.g. reset signals sampled during reset operations may exhibit voltage levels lower than the desired reset level). Consequently, the pixel signal computed by the column readout circuitry becomes an undesirably small value, so that an over-illuminated pixel appears dark when it should appear bright.

A typical anti-eclipse circuit is configured to correct the voltage level of the reset signal by pulling the reset level up to a corrected voltage, thereby minimising the eclipse effect.

media97.png
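
A minimal sketch of the clamping behaviour described above, assuming digitised reset samples, an illustrative clamp threshold and an illustrative corrected reset level.

  import numpy as np

  def anti_eclipse_clamp(reset_samples, clamp_threshold, corrected_level):
      """If a reset sample has collapsed below the clamp threshold (strong
      light spilling into the floating diffusion), pull it up to a corrected
      reset level so that the eclipsed pixel does not read out as dark."""
      reset = np.asarray(reset_samples, dtype=float)
      return np.where(reset < clamp_threshold, corrected_level, reset)

  # Hypothetical reset samples; the second pixel is eclipsed
  reset = np.array([2000.0, 400.0, 1990.0])
  print(anti_eclipse_clamp(reset, clamp_threshold=1500.0, corrected_level=2000.0))
  # [2000. 2000. 1990.]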

Synonyms and Keywords

In patent documents, the following words/expressions are often used as synonyms:

  • "eclipse", "darkening", "blackening", "dark defect", "black crush", "black sun", "dark sun", "black inversion", "white-black inversion", "black dot", "black grave", "black core", "black point", "tanning phenomenon", "sunspot phenomenon", "solar blackening", "blackening phenomenon", "spotlight blackening", "high-brightness darkening", "black depression", "black sinking", "high-intensity blackening".
for reducing horizontal stripes caused by saturated regions of CMOS sensors
Definition statement

This place covers:

Circuits for control of noise that appears as horizontal stripes in the image and is normally caused by voltage variations or coupling effects due to sampling or resetting overexposed pixels. It is also called streaking, pseudo-smear or band-like pattern noise.

media144.png

applied to dark current
Definition statement

This place covers:

  • Circuits for detection and reduction of dark current.
  • Circuits performing dark frame subtraction, which removes an estimate of the mean fixed pattern; a temporal noise component still remains, because the dark current itself has shot noise.
  • Circuits using optical black pixels for dark current compensation.
  • Circuits using optical black pixels provided for each pixel or group of pixels. 
Relationships with other classification places

The pattern of different dark currents can result in a fixed-pattern noise which is classified in H04N 25/67. Dark current is caused by charges generated in the detector when no radiation is entering the detector. Accordingly, only the fixed pattern noise caused by the dark current can be corrected or compensated. Dark current is temperature, exposure and pixel size dependent.
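
A minimal sketch of the dark frame subtraction mentioned above, assuming a set of dark frames captured at the same temperature and exposure time as the image; only the mean fixed pattern is removed, while the shot noise of the dark current remains. The function name and the example values are illustrative.

  import numpy as np

  def subtract_dark_frame(image, dark_frames):
      """Remove the mean dark-current fixed pattern by subtracting the
      average of several dark frames; the temporal shot noise of the dark
      current itself is not removed by this correction."""
      image = np.asarray(image, dtype=float)
      master_dark = np.mean(np.asarray(dark_frames, dtype=float), axis=0)
      return np.clip(image - master_dark, 0.0, None)

  # Hypothetical 2x2 example with a hot pixel in the top-left corner
  image = [[130.0, 52.0], [49.0, 55.0]]
  darks = [[[80.0, 2.0], [1.0, 3.0]], [[82.0, 2.0], [1.0, 3.0]]]
  print(subtract_dark_frame(image, darks))   # [[49. 50.] [48. 52.]]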

by using optical black pixels
Definition statement

This place covers:

Pixels shielded from incident light for detecting only dark current.

applied to reset noise, e.g. KTC noise related to CMOS structures by techniques other than CDS
Definition statement

This place covers:

Circuits for reduction of reset noise:

  • by applying soft reset or combination of soft and hard reset;
  • by feeding back the reset readout signal to the floating diffusion.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Noise processing involving a correlated sampling function, e.g. correlated double or triple sampling

H04N 25/616

applied to fixed-pattern noise, e.g. non-uniformity of response
Definition statement

This place covers:

Circuits for detection and reduction of fixed-pattern noise.

for non-uniformity detection or correction
Definition statement

This place covers:

Circuits and arrangements for correcting and detecting non-uniformity caused by sensor characteristics such as:

  • different pixel characteristics – sensitivity, gain, offset, response curve;
  • different characteristics of sampling circuits, amplifiers, ADCs used for different groups of pixels;
  • the resistive or capacitive properties of long readout or control lines.

Circuits and arrangements for correcting and detecting non-uniformity by

  • using dummy pixels and/or dummy structures (not optical black [OB] pixels) for detecting offset variations;
  • using correction circuits for correcting gain variations between pixels or groups of pixels;
  • performing measurement of the gain variations;
  • using correction circuits for correcting offset variations between pixels or groups of pixels;
  • performing measurement of the offset variations.

Non-uniformity correction modes for

  • measuring the gain responses of the pixels;
  • measuring the offset responses of the pixels.
Relationships with other classification places

There is a certain similarity between the circuits and methods for correcting dark current (H04N 25/63) and for correcting offset non-uniformities of the pixels. Since both can be temperature dependent, both can be corrected by using a dark frame.

between adjacent sensors or output registers for reading a single image
Definition statement

This place covers:

Circuits and arrangements for correcting and detecting non-uniformity between adjacent regions or output registers.

by using reference sources
Definition statement

This place covers:

Circuits that use dedicated dummy pixels for detecting and correcting non-uniformity;

Circuits that use a reference voltage source;

Circuits that use a dark image of the scene.

based on the scene itself, e.g. defocusing
Definition statement

This place covers:

Circuits that use information from the captured image for determining non-uniformity characteristics, e.g.:

  • the scene information may be selected from expected uniform regions;
  • the scene information can be defocused to generate a uniform-like scene;
  • the scene can be captured by using pixel shifting, and the difference between the pixels that capture the same part of the scene can be used for detecting non-uniformity.
for reducing the column or line fixed pattern noise
Definition statement

This place covers:

Details of reducing column or line fixed pattern noise. This noise is caused by different characteristics of column parallel circuits.

applied to defects
Definition statement

This place covers:

Circuits for correction of defects caused by:

  • defective or non-responsive pixels;
  • defects in column readout lines;
  • defects in readout circuits;
  • defects in the scanning circuits;
  • defects in the control lines.
by defect estimation performed on the scene signal, e.g. real time or on the fly detection
Definition statement

This place covers:

Details of circuits that detect defects such as non-responsive pixels in real time by using the image signal.

SSIS architectures; Circuits associated therewith
Definition statement

This place covers:

Details of SSIS architecture.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Imager structures, as devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate, per se

H01L 27/146

Line sensors
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Details of scanning heads for picture-information pick-up with photodetectors arranged in a substantially linear array

H04N 1/03

SSIS architectures characterised by non-identical, non-equidistant or non-planar pixel layout
Definition statement

This place covers:

SSIS with

  • non-planar (e.g. foveal) or curved pixel layout;
  • non-identical or non-equidistant pixels distributed over the pixel array.
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Controlling sensor dynamic range using two or more simultaneously acquired exposures using different pixel sensitivities

H04N 25/585

Imager structures

H01L 27/146

SSIS architectures incorporating pixels for producing signals other than image signals
Definition statement

This place covers:

SSIS comprising dedicated pixels or control thereof, e.g.

  • pixels specially adapted for white balance measurement;
  • pixels for exposure or ambient light measurement;
  • pixels for triggering an exposure period;
  • pixels for edge detection;
  • pixels for event detection, for motion or difference detection or for level detection;
  • pixels for storing additional non-volatile information;
  • pixels for measuring substrate temperature.
Pixels specially adapted for focusing, e.g. phase difference pixel sets
Definition statement

This place covers:

SSIS using pixels specially adapted for focusing, including:

  • SSIS comprising phase difference pixels.
  • SSIS comprising only phase difference pixels, i.e. all pixels comprise more than one photodiode per micro lens. The photodiodes can have shared amplifiers or can be connected to different (shared) amplifiers.
References
Application-oriented references

Examples of places where the subject matter of this place is covered when specially adapted, used for a particular purpose, or incorporated in a larger system:

Focusing based on the difference in phase signals

H04N 23/672

Informative references

Attention is drawn to the following places, which may be of interest for search:

Systems for automatic generation of focusing signals using different areas in a pupil plane

G02B 7/34

Pixels for depth measurement, e.g. RGBZ
Definition statement

This place covers:

SSIS using pixels for depth measurement, e.g. using time of flight [TOF] or using photonic mixer devices [PMD].

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Image signal generators for stereoscopic video systems constituting depth maps or disparity maps

H04N 13/271

Pixels specially adapted for focusing, e.g. phase difference pixel sets

H04N 25/704

Detector arrays as receiver circuits of Lidar systems

G01S 7/4863, G01S 7/4914

Time delay measurement, e.g. time-of-flight measurement, at Lidar receivers

G01S 7/4865, G01S 7/4915

Lidar systems, specially adapted for specific applications

G01S 17/88

Pixels for event detection
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Image sensors with pixel address output; Event-driven image sensors; Selection of pixels to be read out based on image data

H04N 25/47

Circuitry for control of the power supply
Definition statement

This place covers:

SSIS with circuitry for controlling the power supply, including

  • for controlling the control signal levels;
  • for controlling different bias and reference voltages;
  • biasing circuits for adjusting or controlling the bias of the substrate or other circuitry.
Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Charge-coupled imagers

H01L 27/148

Transfer or readout registers; Split readout registers or multiple readout registers
Definition statement

This place covers:

  • Details of transfer registers;
  • Details of readout registers, having for example a changeable transfer direction;
  • Electron multiplying CCD [EMCCD] registers;
  • Split readout registers;
  • Multiple readout registers, e.g. for readout in the H and V directions or for reading out different colours.

Circuitry for scanning or addressing the pixel array
Definition statement

This place covers:

  • Addressing circuits for CCD pixel arrays.
  • CCD timing and clock generating circuits typically generate the vertical and horizontal sync signals VT, VH which determine the timing of vertical and horizontal scanning operations. A further driver circuit generates driving signals that force the CCD to transfer the information through the transfer registers. The parent group covers circuits for generating the driving signals and details related to the said driving signals or pulses.

media119.png

Relationships with other classification places

If the document does not provide any specific details related to the row scanning/addressing circuits but rather functionally describes details of performing different sensor readout operations, then group H04N 25/40 only should be used for classification. Similarly, if the document specifies only functional details related to control of the exposure time, then group H04N 25/50 and/or group H04N 25/57 should be used for classification.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangements for selecting an address in a digital store

G11C 8/00

Circuitry for providing, modifying or processing image signals from the pixel array
Definition statement

This place covers:

Readout circuits that are applicable to a CCD image sensor.

Readout circuits for CCD sensors arranged at the output of the sensor:

  • CCD output stages like output buffers and source followers.
  • CCD output stages which are column parallel, i.e. provided for each column.

CCD circuitry for modifying or processing image signals from the pixel array.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Amplifiers per se

H03F

Analogue/digital conversion per se

H03M 1/00

Addressed sensors, e.g. MOS or CMOS sensors
Definition statement

This place covers:

Circuits of addressed sensors, and circuits for driving or controlling addressed sensors.

There is a wide variety of addressed image sensors using different ways of transforming light to electrical current or voltage. The following aspects are classified in this group.

Active pixel sensors [APS]:

  • using photodiodes or two-terminal semiconductor elements as photodetector;
  • using a graphene layer as photodetector;
  • using a photo-conversion layer as photodetector;
  • having pixels with small full-well capacity (e.g. 200 e-), high conversion gain (e.g. 1 mV/e-) and small pixel size (e.g. 900 nm), e.g. QIS or binary pixels.

Passive pixel sensors:

  • using photodiodes or two terminal semiconductor elements as photodetector;
  • using bipolar transistors as photodetector;
  • using charge injection devices [CID];
  • using charge modulation devices, static induction transistors [SIT] or base-stored image sensors [BASIS];
  • using CMOS-CCD structures;
  • using diodes for (row) selection switches.

Bolometers used for far infrared imaging.

This group also covers addressed image sensors:

  • comprising an additional frame memory;
  • comprising testing structures;
  • implemented within a display panel;
  • providing specific details of the sensor input/output interfaces;
  • providing details of partitioning of the signal processing circuits between the sensor and another chip;
  • being a camera on chip.
References
References out of a residual place

Examples of places in relation to which this place is residual:

Detection or reduction of inverted contrast or eclipsing effects

H04N 25/627

Detection or reduction of noise due to excess charges produced by the exposure for reducing horizontal stripes caused by saturated regions of CMOS sensors

H04N 25/628

Informative references

Attention is drawn to the following places, which may be of interest for search:

Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors

H04N 25/71

Semiconductor technology of imager structures other than CCD, e.g. CMOS

H01L 27/146

Charge-coupled imagers per se

H01L 27/148

comprising control or output lines used for a plurality of functions, e.g. for pixel output, driving, reset or power
Definition statement

This place covers:

Addressed sensors comprising control lines used for a plurality of control functions, for example:

  • a control line used to control the transfer gate of one pixel and to control the reset gate of another pixel;
  • a control line used as power line, pixel select line or column output line.

media99.png

Horizontal readout lines, multiplexers or registers
Definition statement

This place covers:

  • Arrangement of scanning circuits for generating control signals for a multiplexer or an arrangement of switches that connects the column lines of the sensor array to the sensor output. In contrast to CCD sensors, addressed image sensors do not necessarily comprise transfer or readout registers that transfer the image signal to the output.
  • Details of analogue (pixel signal) shift registers and scanning circuits thereof.
  • Bucket-brigade type shift registers.
  • Details of digital (signal) shift registers and scanning circuits thereof.
  • Horizontal and vertical lines to read out the pixel array in x- and y- directions.
  • Multiple horizontal readout lines for different sensor regions.
  • Multiple horizontal readout lines for different colours.
  • Details of multiplexers or switches for horizontal scanning used for performing horizontal binning between signals from different column lines.
  • Details of multiplexers or switches for outputting signals from a column line to different readout lines.

media100.png

media101.png

Relationships with other classification places

Circuits like AD converters, correlated double sampling or amplifiers provided for each column are not part of the readout registers, but all these circuits can be part of a readout circuit. Details of column parallel AD converters, CDS circuits or column amplifiers are classified in group H04N 25/78. Group H04N 25/767 covers details of how the data is transmitted to the output.

media102.png

Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
Definition statement

This place covers:

Details of pixel circuits and control thereof. However, since pixel circuits such as 3T, 4T, 5T or passive pixels are well known, these pixel structures as such are not classified in this group unless the invention relates to specific pixel properties.

Pixels characterised by their mode of operation:

  • pixels having different modes, e.g. a pixel configurable to work as a TOF pixel, as a photon counter, for event detection, as an integration pixel, etc.

media103.png

  • pixels having different read-out modes.

media104.png

Pixel details related to the pixel output interface. For example, pixels:

  • having multiple outputs;
  • having digital and analogue output;
  • having passive and active output, i.e. pixels which can be read out as passive and active pixels;
  • characterised by the type and the characteristics of the amplifier used, for example pixels having specific details related to the source follower in the APS and to the source follower transistor, e.g. the type of the SF transistor, the load of the SF implemented in the pixel, or control of the SF voltage;

media105.png

  • multistage amplifiers, e.g. two-stage source followers;
  • multiple source followers per pixel connected in parallel;
  • distributed amplifiers, i.e. pixels comprising only part of the amplifier, the remaining part is shared for a group of pixels or for a column of pixels;
  • CTIA or common drain amplifiers rather than source followers;
  • pixels characterised by the type and the characteristics of the charge transfer elements, for example pixels:
  • with details of control of the transfer gate;
  • with details of the transfer gate transistor, e.g. enhancement- or p-type;
  • with plurality of transfer gates connected in parallel;
  • with plurality of transfer gates connected in series.

Note: a plurality of transfer gates for connecting additional storage means within the pixel is classified in group H04N 25/771.

media106.png

  • having a direct injection gate;

media107.png

  • having a charge multiplying portion;
  • having a time segregation structure for arrival time measurement;
  • reading the photocurrent.

Pixels characterised by the type and the characteristics of the reset switch. For example, pixels:

  • with reset level control;
  • with details of the reset transistor, e.g. enhancement- or p-type.

Pixels comprising control circuits using signals from the neighbouring pixels, e.g. for control of pixel conversion gain or exposure time as a function of the average signal value of the neighbouring pixels.

Pixels comprising capacitors for applying control signals (RST, SEL) through them.

Special rules of classification

H04N 25/77 and subgroups do not cover associated circuits. For example, an A/D converter [ADC] in the readout circuit outside the matrix is classified in group H04N 25/78 and not in subgroup H04N 25/772.

comprising storage means other than floating diffusion
Definition statement

This place covers:

Addressed sensors with pixel circuits comprising additional storage means, i.e. storage means other than the floating diffusion.

The storage means can be analogue storage means:

  • in the charge domain;

media108.png

  • in the voltage domain, i.e. after the source follower.

The storage means can be digital memories or non-volatile memories.

The storage means are used for different purposes. For example:

  • for storing reset and exposure signals for performing CDS;
  • for storing several exposure periods;
  • for performing high frame rate imaging;
  • for performing HDR imaging;
  • for storing overflow charges during the exposure period;
  • for storing non-destructive readout signals during the exposure period.

media109.png

Relationships with other classification places

This group is not used for memories provided in the AD converters. Such pixels are classified in group H04N 25/772.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Extending the dynamic range of solid-state image sensors [SSIS] by controlling the amount of charge storable in the pixel

H04N 25/59

Noise processing involving correlated double sampling [CDS] performed in the pixel

H04N 25/616

CDS performed in readout circuits for addressed sensors

H04N 25/78

comprising A/D, V/T, V/F, I/T or I/F converters
Definition statement

This place covers:

Addressed sensors with pixels or groups of pixels comprising A/D, V/T, V/F, I/T or I/F converters. The converters should be at least partially implemented in the pixel array.

Stacked chip structures in which a pixel or a group of pixels is connected to an A/D converter implemented on a different chip.

This group does not cover image sensors in which a column of pixels is connected to an A/D converter.

A/D converters can be of any type and can be specifically designed for photoelectric pixel circuits and/or to work in combination with other pixel elements like transfer gates, reset gates, source followers, etc. A/D converters can be used to convert the image signal from the pixel to a digital value. A/D converters can be used to generate a digital value for controlling different characteristics of the pixel like its exposure time or sensitivity.

Some pixel circuits comprising converters provide an analogue and a digital output or a multiplexed digital and analogue output.

The converters convert current or voltage levels to signals of different frequency, e.g. a current-to-frequency [I/F] converter, or use a voltage-controlled oscillator to perform voltage-to-frequency [V/F] conversion.

media110.png

The converters convert the signal from the photo sensor to a time-dependent signal (V/T or I/T converter). These circuits are sometimes called ADCs using pulse width modulation [PWM]. A comparator measures the duration of the exposure time needed for the pixel to reach a predetermined threshold; the duration of the pulse corresponds to the pixel level. The duration of the pulse can be converted to a digital value by using a counter, or to an analogue signal by using a ramp signal.

media111.png

The converters are ADCs that count the number of exposure periods. These circuits are sometimes also called voltage-to-frequency converters or ADCs using pulse frequency modulation [PFM]. The duration of each exposure period is defined by a control circuit that determines when the signal from the photodiode reaches a predetermined threshold. The control circuit normally performs a reset operation and starts a new exposure period. Note that a part of or the entire control circuit can be implemented outside the pixel array.

media112.png
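
A minimal behavioural sketch of the two conversion schemes described above (pulse width modulation and pulse frequency modulation), assuming an ideal, linearly integrating pixel and an ideal comparator; the signal slope, threshold, clock frequency and frame time are illustrative values only.

  def pwm_code(signal_slope, threshold, clock_hz):
      """PWM (single-slope) conversion: count clock cycles until the
      integrating pixel signal crosses the threshold; a brighter pixel
      crosses sooner, so the count (pulse width) encodes the pixel level."""
      time_to_threshold = threshold / signal_slope
      return int(round(time_to_threshold * clock_hz))

  def pfm_code(signal_slope, threshold, frame_time):
      """PFM conversion: each time the integrating signal reaches the
      threshold the pixel is reset and a counter increments; the number of
      resets within the frame time encodes the pixel level."""
      time_per_reset = threshold / signal_slope
      return int(round(frame_time / time_per_reset))

  # Hypothetical dim and bright pixels (slope in threshold units per second)
  print(pwm_code(200.0, 1.0, clock_hz=1e6), pwm_code(800.0, 1.0, clock_hz=1e6))
  # 5000 1250  (shorter pulse for the brighter pixel)
  print(pfm_code(200.0, 1.0, frame_time=0.02), pfm_code(800.0, 1.0, frame_time=0.02))
  # 4 16  (more resets for the brighter pixel)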

The converters are part of photon counting pixels that generate one-bit signals corresponding to a detected photon, and the number of detected photons for a predetermined time is counted to provide a digital value (Details for such pixel circuits can be found in groups G01T 1/247, G01J 1/46 as a part of a radiation measuring system).

media113.png

The converters are single slope ADCs (Details of single slope ADCs as such can be found in group H03M 1/56).

The converters are flash type ADCs.

The converters are sigma delta ADCs.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Electric circuits for photometry

G01J 1/44

Semiconductor detectors for measuring radiation intensity of X-, gamma, corpuscular or cosmic radiation

G01T 1/24

Analogue/digital converters

H03M 1/12

Glossary of terms

In this place, the following terms or expressions are used with the meaning indicated:

A/D converter

Circuit for analogue-to-digital conversion [ADC] of a signal

V/T converter

Circuit for converting a pixel output voltage to a time signal

V/F converter

Circuit for converting a pixel output voltage to a frequency signal

I/T converter

Circuit for converting a pixel output current to a time signal

I/F converter

Circuit for converting a pixel output current to a frequency signal

comprising photon counting circuits, e.g. single photon detection [SPD] or single photon avalanche diodes [SPAD]
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Photometry using electric radiation detectors

G01J 1/42

Semiconductor sensitive to radiation in which the potential barrier is working in avalanche mode, e.g. avalanche photodiode

H01L 31/107

comprising amplifiers shared between a plurality of pixels, i.e. at least one part of the amplifier must be on the sensor array itself
Definition statement

This place covers:

Addressed sensors with pixel structures in which multiple photodiodes are provided. Respective transfer gates are used to transfer the charges accumulated in the photodiodes to a common floating diffusion, and the floating diffusion is connected to the gate of an amplifier transistor. The amplifier is implemented within the pixel array.

media114.png
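
A minimal sketch of the shared-amplifier readout, assuming four photodiodes sharing one floating diffusion [FD] and one source-follower amplifier: the transfer gates are pulsed one after the other, so the shared amplifier outputs the signal of each photodiode in turn. The conversion gain and charge values are example assumptions.

    # Illustrative sketch of reading out a pixel with four shared photodiodes.
    def read_shared_pixel(photodiode_charges_e, conversion_gain_uv_per_e=60.0):
        """Transfer each photodiode's charge to the shared FD in sequence and
        sample the shared amplifier output after each transfer."""
        outputs_uv = []
        for charge_e in photodiode_charges_e:    # pulse TG1..TG4 in turn
            fd_charge_e = charge_e               # FD assumed reset before each transfer
            outputs_uv.append(fd_charge_e * conversion_gain_uv_per_e)
        return outputs_uv

    # Four photodiodes with different accumulated charges (in electrons).
    print(read_shared_pixel([1000, 2000, 1500, 500]))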

Relationships with other classification places

Passive pixel sensors comprising a shared amplifier per column are classified in group H04N 25/76.

Active pixel sensors comprising column parallel amplifiers are classified in group H04N 25/78.

Special rules of classification

Where shared pixel structures are used for different applications, classification is also made in the groups relating to the application. For example, where charges of the shared photodiodes are binned in the floating diffusion, classification is also made in group H04N 25/46. Where shared photodiodes have different sensitivities, classification is also made in group H04N 25/585. Pixels specially adapted for focusing (e.g. phase difference pixel sets) are also classified in group H04N 25/704.

Shared photodiodes have different sensitivity

H04N 25/585

Pixels specially adapted for focusing, e.g. phase difference pixel sets

H04N 25/704

Charges of the shared photodiodes are binned in the floating diffusion

H04N 25/46

Circuitry for scanning or addressing the pixel array
Definition statement

This place covers:

  • Addressed image sensors such as CMOS image sensors using row and column scanning or addressing circuits.
  • Addressed sensor circuitry where the column scanning/addressing circuits are only used to provide row pixel data to the output of the sensor.
  • Addressed sensor circuitry where the row scanning/addressing circuits, in addition to the row select signals, provide further control signals to the pixels such as transfer gate [TG] or reset [RST] signals.
  • Details of scanning/addressing circuits for addressed image sensors.
  • Details related to the electronic circuitry of the scanning circuits, to multiple scanning circuits, and to the generation of driving pulses for TG, RST and ROW SEL (see the sketch below).

media120.png
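
A minimal sketch of the addressing scheme listed above, assuming a simple rolling readout: the row scanner selects one row at a time and the column circuits then read every pixel of that row. The array values are arbitrary example data.

    # Illustrative sketch of row/column addressing in an addressed image sensor.
    def read_frame(pixel_array):
        """Read a 2-D pixel array row by row (row scanner + column readout)."""
        frame = []
        for row in pixel_array:
            # Row scanner: assert ROW SEL (and, in a real sensor, TG/RST pulses)
            # for this row only; all other rows stay unselected.
            # Column circuits: sample every column of the selected row, then
            # shift the values out sequentially to the sensor output.
            frame.append(list(row))
        return frame

    pixels = [[10, 20, 30],
              [40, 50, 60]]
    print(read_frame(pixels))   # [[10, 20, 30], [40, 50, 60]]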

Relationships with other classification places

If the document does not provide any specific details related to the row scanning/addressing circuits but instead describes functional details of performing different sensor readout operations, then classification should only be made in H04N 25/40. Similarly, if the document only specifies functional details related to control of the exposure, then classification should be made in H04N 25/50 and/or H04N 25/57 as appropriate.

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Arrangements for selecting an address in a digital store

G11C 8/00

{Circuitry for generating timing or clock signals}
Definition statement

This place covers:

Details of timing or clock signal generating circuits. These circuits drive the row electronics and the column electronics, and control the readout of the pixel area.

media145.png
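
A minimal sketch of one possible timing-generation scheme, assuming a free-running master clock counter decoded into column-readout ticks and a row-advance pulse; the number of column ticks per row is an arbitrary example value.

    # Illustrative sketch of a timing/clock generator for an addressed sensor.
    def timing_events(master_clock_cycles, columns_per_row=4):
        """Yield (cycle, event) pairs decoded from the master clock counter."""
        for cycle in range(master_clock_cycles):
            yield cycle, "COLUMN_TICK"            # clocks the column readout
            if (cycle + 1) % columns_per_row == 0:
                yield cycle, "ROW_ADVANCE"        # moves the row scanner on

    for event in timing_events(8):
        print(event)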

Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters
Definition statement

This place covers:

Readout circuits for addressed image sensors defining details related to the column readout lines and the circuits associated with them. Although the readout lines are placed in the sensor array, they are a functional part of the readout circuits.

These details are, for example, readout arrangements with:

  • several column readout lines per column of pixels;
  • column lines connectable by switches to perform analogue signal averaging/binning (see the sketch after this list);
  • multiple column lines multiplexed to be processed by common processing means, like CDS, ADC or buffers;
  • column lines randomly connectable to different processing means (CDS, ADC, buffers) to randomise the column pattern noise;
  • a column line being shared for pixels in a row;
  • several storage capacitors per column used for CDS, binning, multi-frame storage, etc.;
  • reset or clamping circuits connected to the column lines.
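
A minimal sketch of the analogue averaging/binning arrangement mentioned above, under the idealised assumption that closing the switches between adjacent column lines makes the readout circuit see the average of those columns. The values are arbitrary example codes.

    # Illustrative model of 2x horizontal binning by averaging adjacent columns.
    def bin_columns(row_values, factor=2):
        """Average groups of adjacent column values of one row."""
        binned = []
        for start in range(0, len(row_values) - factor + 1, factor):
            group = row_values[start:start + factor]
            binned.append(sum(group) / factor)
        return binned

    print(bin_columns([40, 44, 90, 86]))   # [42.0, 88.0]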

Details related to the load circuit, e.g. current source of the source follower and control thereof.

Details related to ADC circuits (ADC circuits as such – group H03M 1/12) used in sensor array readout circuits.

These details are, for example, related to:

  • ADC type, like single slope, flash, SAR, sigma-delta, or ADC combined with the gain of a programmable gain amplifier (ADCs of this type as such are classified in group H03M 1/18);

media121.png

  • ADC arrangement in the readout circuit;
  • ADC arranged per column or for a group of columns;

media122.png

  • ADC arranged at the output of the sensor;
  • ADC ramp voltage generation - different slopes and directions, non-linear, ramp amplitude;
  • Processing implemented in the ADC, like CDS or binning.

Details related to output amplifiers:

  • CTIA amplifiers (normally used in passive image sensors);
  • Amplifiers with controllable gain, e.g. GCA or PGA;
  • Amplifiers arranged per column or for a group of columns;
  • Amplifiers arranged at the output of the sensor.

Details of arrangement of the CDS circuit as part of the readout circuit (see the sketch below):

  • CDS arranged per column;

media123.png

  • CDS arranged at the output of the sensor.

CDS circuits as such are classified in group H04N 25/616.
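
A minimal sketch of the CDS operation, shown here as a digital subtraction of ADC codes (analogue CDS performs the equivalent subtraction before conversion): for each pixel a reset-level sample and a signal-level sample are taken, and their difference cancels the per-pixel reset offset. The sample values are arbitrary example codes.

    # Illustrative model of correlated double sampling [CDS].
    def cds(reset_samples, signal_samples):
        """Return signal minus reset for each pixel of a row."""
        return [sig - rst for rst, sig in zip(reset_samples, signal_samples)]

    reset_row  = [205, 198, 201]   # codes sampled just after pixel reset
    signal_row = [364, 240, 321]   # codes sampled after charge transfer
    print(cds(reset_row, signal_row))   # [159, 42, 120]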

References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Amplifiers per se

H03F

Analogue/digital conversion per se

H03M 1/00

Arrangements of circuitry being divided between different or multiple substrates, chips or circuit boards, e.g. stacked image sensors
Definition statement

This place covers:

Solid state image sensor [SSIS] circuitry divided between different or multiple substrates, including:

  • Details of circuits and control thereof adapted for stacked image sensors and the like;
  • Details of partitioning the image sensor functional blocks such as the pixel array, scanning circuits, readout circuits and memories between different stacked chips;
  • Details of pixel circuitry distributed between different layers;
  • Details of ADC circuitry distributed between different layers;
  • Details of specific control arrangements or control lines adapted for stacked sensors (see the sketch below).
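
A minimal sketch of one possible partitioning of the functional blocks between two stacked chips; the block names and the split shown here are example assumptions, not a required arrangement.

    # Illustrative description of a two-layer stacked image sensor partition.
    stacked_sensor_partition = {
        "top chip (pixel layer)": ["pixel array", "row drivers"],
        "bottom chip (logic layer)": ["column ADCs", "timing generator",
                                      "frame memory", "digital signal processing"],
    }

    for chip, blocks in stacked_sensor_partition.items():
        print(chip, "->", ", ".join(blocks))
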
References
Informative references

Attention is drawn to the following places, which may be of interest for search:

Flat panel detectors for transforming X-rays into image signals with butting of tiles

H04N 25/30

Line sensors using abutted sensors arranged in a long line

H04N 25/7013