
Generic visual perception processor



Generic visual perception processor is a single chip modeled on the perception capabilities of the human brain, which can detect objects in a motion video signal and then locate and track them in real time. Imitating the neural networks of the human eye and brain, the chip can handle about 20 billion instructions per second. This electronic eye on a chip handles tasks ranging from sensing variable parameters, in the form of video signals, to processing them.


1. A visual perception processor for automatically detecting an event occurring in a multidimensional space (i, j) evolving over time with respect to at least one digitized parameter in the form of a digital signal on a data bus, said digital signal being in the form of a succession aijT of binary numbers associated with synchronization signals enabling to define a given instant (T) of the multidimensional space and the position (i, j) in this space, the visual perception processor comprising: the data bus; a control unit; a time coincidences bus carrying at least a time coincidence signal; and at least two histogram calculation units for the treatment of the at least one parameter, the histogram calculation units being configured to form a histogram representative of the parameter as a function of a validation signal and to determine by classification a binary classification signal resulting from a comparison of the parameter and a selection criterion C, wherein the classification signal is sent to the time coincidences bus, and wherein the validation signal is produced from time coincidences signals from the time coincidence bus so that the calculation of the histogram depends on the classification signals carried by the time coincidence bus.
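The data flow of claim 1 can be sketched in software. This is an illustrative model, not the patented hardware; the class and parameter names are hypothetical, and the selection criterion C is assumed here to be a simple value range:

```python
class HistogramUnit:
    """Software sketch of one histogram calculation unit (claim 1)."""

    def __init__(self, n_bits, criterion):
        self.histogram = [0] * (1 << n_bits)  # one counter per possible value of aijT
        self.criterion = criterion            # selection criterion C as a (lo, hi) range

    def classify(self, value):
        # Binary classification signal: does the parameter satisfy C?
        lo, hi = self.criterion
        return lo <= value <= hi

    def process(self, value, coincidence_signals):
        # The validation signal is produced from the time coincidence signals
        # on the bus; the histogram is updated only when validation is positive.
        if all(coincidence_signals):
            self.histogram[value] += 1
        # The classification signal is sent back to the time coincidences bus.
        return self.classify(value)

unit = HistogramUnit(n_bits=8, criterion=(64, 128))
unit.process(100, coincidence_signals=[True, True])   # counted and classified True
unit.process(100, coincidence_signals=[True, False])  # classified True, not counted
```

Several such units in a matrix, each listening to the classification signals of the others, give the coupled behavior described in claim 2.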



2. A visual perception processor according to claim 1, further comprising, to process several parameters, several histogram calculation units organized into a matrix, wherein each of the calculation units is connected to the data bus and to the time coincidences bus.

3. A visual perception processor, comprising: a data bus; a time coincidences bus; and two or more histogram calculation units that receive the data DATA(A), DATA(B), . . . DATA(E) via the data bus and supply classification information to the single time coincidences bus, wherein at least one of said two or more histogram calculation units processes data aijT associated with pixels forming together a multidimensional space (i, j) evolving over time and represented at a succession of instants (T), wherein said data reaches said at least one calculation unit in the form of a digital signal DATA(A) in the form of a succession aijT of binary numbers of n bits associated with synchronization
signals enabling to define the given instant (T) of the multidimensional space and the position (i, j) of the pixels in this space, to which the signal aijT received at a given instant (T) is associated, said unit comprising: an analysis memory including a memory with addresses, each address associated with possible values of the numbers of n bits of the signal DATA(A) and whose writing process is controlled by a WRITE signal; a classifier unit comprising a memory intended for receiving a selection criterion C of the parameter DATA(A), said classifier unit receiving the signal DATA(A) at the input and outputting a binary output signal having a value that depends on a result of the comparison of the signal DATA(A) with the selection criterion C; a time coincidences unit that receives the output signal from the classifier unit and, from outside the histogram calculation unit, individual binary enabling signals affecting parameters other than DATA(A), wherein said time coincidences unit outputs a positive global enabling signal when all the individual time coincidences signals are positive; a test unit; an analysis output unit including output memory; an address multiplexer; an incrementation enabling unit; and a learning multiplexer; wherein a counter of each address in the memory corresponds to the value d of aijT at a given instant, which is incremented by one unit when the time coincidences unit outputs a positive global enabling signal; wherein the test unit is provided for calculating and storing statistical data, and processes, after receiving the data aijT corresponding to the space at an instant T, a content of the analysis memory in order to update the output memory of the analysis output unit, wherein the output memory is deleted before a beginning of each frame for a space at an instant T by an initialization signal;

wherein the learning multiplexer is configured to receive an external command signal and initiate an operation according to a learning mode in which registers of the classifier unit and of the time coincidences unit are deleted when starting to process a frame, wherein the analysis output unit supplies values typical of a sequence of each of these registers.
4. A visual perception processor according to claim 3, wherein the memory of the classifier is an addressable memory enabling real time updating of the selection criterion C and having a data input register, an address command register and a writing command register, receiving on its input register the output from the analysis memory and a signal End on its writing command register, the processor further comprising a data input multiplexer with two inputs and one output, receiving on one of its inputs a counting signal and on its other input the succession of data aijT to the address command of the memory of the classifier and an operator OR controlling the address multiplexer and receiving on its inputs an initialization signal and the end signal END.

5. A visual perception processor according to claim 4, wherein the space (i, j) is two-dimensional and wherein the signal DATA(A) is associated with the pixels of a succession of images.

6. A visual perception processor according to claim 3, further comprising means for anticipating the value of the classification criterion C.

7. A visual perception processor according to claim 6, wherein the means for anticipating the value of the classification criterion C comprises memories intended for containing the values of statistical parameters relating to two successive frames T0 and T1.
8. A visual perception processor according to claim 7, wherein the statistical parameters are the average values of the data aijT enabled.

9. A visual perception processor according to claim 3, wherein the analysis output register stores in its memory at least one of the following values: the minimum 'MIN', the maximum 'MAX', the maximum number of pixels for which the signal Vijt has a particular value 'RMAX', the corresponding particular value 'POSRMAX', and the total number of enabled pixels 'NBPTS'.
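The statistics of claim 9 can be computed in software from a finished histogram. A minimal sketch, assuming the histogram is a plain list of counters indexed by value (the helper name is hypothetical):

```python
def histogram_statistics(hist):
    """Compute the analysis-output values of claim 9 from a histogram."""
    nonzero = [v for v, count in enumerate(hist) if count > 0]
    return {
        "MIN": min(nonzero),               # smallest enabled value
        "MAX": max(nonzero),               # largest enabled value
        "RMAX": max(hist),                 # pixel count at the histogram peak
        "POSRMAX": hist.index(max(hist)),  # value at which the peak occurs
        "NBPTS": sum(hist),                # total number of enabled pixels
    }

stats = histogram_statistics([0, 3, 7, 7, 1])
# → {'MIN': 1, 'MAX': 4, 'RMAX': 7, 'POSRMAX': 2, 'NBPTS': 18}
```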

10. A visual perception processor according to claim 3, wherein the statistical comparison parameter used by the classifier is RMAX/2.
11. A visual perception processor according to claim 3, further comprising a control multiplexer configured to receive at its input several statistical parameters and wherein the comparison made by the classifier depends on a command issued by the control multiplexer.

12. A visual perception processor according to claim 3, wherein the memory of the classifier includes a set of independent registers D, each comprising one input, one output and one writing command register, wherein the number of these registers D is equal to the number n of bits of the numbers of the succession Vijt, the classifier further comprising a decoder configured to output a command signal corresponding to the related input value (address) and a multiplexer controlled by this input value, thus enabling to read the chosen register.

13. A visual perception processor according to claim 12, further comprising register input multiplexers, each being associated with the input of a register, and combinatory modules connecting the registers to one another, wherein the register input multiplexers are configured to choose between a sequential writing mode and a writing mode common to all the registers connected together by the combinatory modules.

14. A visual perception processor according to claim 13, wherein the combinatory modules comprise a morphological expansion operator including a three-input logic unit 'OR', wherein the first input unit receives the output signal of the 'Q'-order register, wherein the second input unit is connected to the output of a two-input logic unit 'AND' receiving respectively the output signal of the 'Q+1'-order register and a positive expansion signal, and wherein the third input unit is connected to the output of a two-input logic unit 'AND' receiving respectively the output signal of the 'Q-1'-order register and a negative expansion signal.




15. A visual perception processor according to claim 14, wherein the combinatory modules comprise a morphological erosion operator including a three-input logic unit 'AND', wherein the first input unit receives the output signal of the 'Q'-order register, wherein the second input unit is connected to the output of a four-input logic unit 'AND', of which one input is inverted, receiving respectively the output signal of the 'Q'-order register, the output signal of the 'Q-1'-order register, the output signal of the 'Q+1'-order register and a negative erosion signal, and wherein the third input unit is connected to the output of a four-input logic unit 'AND', of which one input is inverted, receiving respectively the output signal of the 'Q'-order register, the output signal of the 'Q-1'-order register, the output signal of the 'Q+1'-order register and a negative erosion signal.

16. A histogram calculation unit according to claim 14, wherein each combinatory module comprises a multiplexer associating a morphological expansion operator and a morphological erosion operator.

17. A visual perception processor according to claim 3, wherein the histogram calculation units are organized into a matrix.

18. A device for detecting one or more events including aural and/or visual phenomena, the device comprising: a controller coupled to a controller bus and a transfer bus; an input portal adapted to receive data describing one or more parameters of the event being detected; and a data processing block coupled to the input portal, the transfer bus and the controller bus, the data processing block including: a histogram unit coupled to the input portal and configured to calculate a histogram for a selected parameter; a classification unit coupled to the input portal and the histogram unit, and configured to determine the data in the histogram that satisfy a selected criterion, and to generate an output accordingly, the classification unit supplying the output to the transfer bus; and a coincidence unit coupled to receive the output of the classification unit from the transfer bus and to receive selected coincidence criteria from the controller bus, the coincidence unit being configured to generate an enable signal for the histogram unit when the output of the classification unit satisfies the selected coincidence criterion, wherein classification is performed automatically by processing statistical information associated with the calculated histogram.

19. The device of claim 18, wherein the classification unit includes a memory table for storing selection criteria, and wherein automatic classification involves updating the selection criteria in the memory table based on the processed statistical information.

20. The device of claim 19, wherein the processed statistical information includes a value RMAX defining the number of data points at the maximum of the calculated histogram, and wherein automatic classification involves updating the selection criteria in the memory table based on the value RMAX.
21. The device of claim 18, wherein the classification unit includes a memory table for storing selection criteria, and wherein automatic classification involves changing an address input to the memory table based on the processed statistical information.


22. A device for detecting one or more events including aural and/or visual phenomena, the device comprising: a controller coupled to a controller bus and a transfer bus;

an input multiplexer adapted to receive data describing one or more parameters of the event being detected, and to output data describing a selected one of the one or more parameters in response to a selection signal; and a data processing block coupled to the multiplexer, the transfer bus and the controller bus, the data processing block including: a histogram unit coupled to the input portal and configured to calculate a histogram for the selected parameter; a classification unit coupled to the input portal and the histogram unit, and configured to determine the data in the histogram that satisfy a selected criterion, and to generate an output accordingly, the classification unit supplying the output to the transfer bus; and a coincidence unit coupled to receive the output of the classification unit from the transfer bus and to receive selected coincidence criteria from the controller bus, the coincidence unit being configured to generate an enable signal for the histogram unit when the output of the classification unit satisfies the selected coincidence criterion.

23. A device for detecting one or more events including aural and/or visual phenomena, the device comprising: a controller coupled to a controller bus and a transfer bus; an input portal adapted to receive data sets describing one or more parameters of the event being detected, each data set being associated with an instant of time; and a data processing block coupled to the input portal, the transfer bus and the controller bus, the data processing block including: a histogram unit coupled to the input portal and configured to calculate a histogram for a selected parameter for a particular instant of time T1; a classification unit coupled to the input portal and the histogram unit, and configured to determine the data in the histogram that satisfy a selected criterion, and to generate an output accordingly, the classification unit supplying the output to the transfer bus; and a coincidence unit coupled to receive the output of the classification unit from the transfer bus and to receive selected coincidence criteria from the controller bus, the coincidence unit being configured to generate an enable signal for the histogram unit when the output of the classification unit satisfies the selected coincidence criterion, wherein the classification unit automatically anticipates values associated with the selected parameter at a next instant of time T2 based on statistical information associated with the calculated histograms at time T1 and at a previous time T0.

24. The device of claim 23, wherein the statistical information at each time T0 and T1 includes a value POSMOY defined as the value, for a set of parameters, which is greater than or equal to half of the values of the set of parameters.

25. The device of claim 24, wherein automatic anticipation is based on a function of POSMOY at T0 minus POSMOY at T1 (P0-P1).

26. The device of claim 25, wherein the function includes one of Y=(P0-P1), Y=a(P0-P1)+b, and Y=a(P0-P1)^2, where a and b are predetermined constants.

27. The device of claim 26, wherein two or more of the functions are multiplexed.
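The anticipation of claims 25-27 predicts the parameter's shift at the next instant T2 from the difference P0-P1 of the POSMOY statistic at T0 and T1. A minimal sketch, where the mode selection stands in for the multiplexing of claim 27 and the constants a and b are hypothetical tuning values:

```python
def anticipate(p0, p1, mode="identity", a=1.0, b=0.0):
    """Predict the shift Y at T2 from POSMOY values p0 (at T0) and p1 (at T1)."""
    d = p0 - p1
    if mode == "identity":
        return d              # Y = (P0 - P1)
    if mode == "linear":
        return a * d + b      # Y = a(P0 - P1) + b
    if mode == "quadratic":
        return a * d * d      # Y = a(P0 - P1)^2
    raise ValueError(f"unknown anticipation mode: {mode}")

# An object whose POSMOY moved from 120 at T0 to 110 at T1 is anticipated
# to shift by another 10 units by T2 in identity mode.
anticipate(120, 110)  # → 10
```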

28. A method of analyzing parameters associated with an event by an electronic device, comprising:

a) receiving data sets representative of one or more parameters of the event being detected, each data set being associated with an instant of time;

b) calculating, for each instant of time, a statistical distribution, defined as a histogram, of a selected parameter of the event being detected;

c) classifying the data set by comparing its parameter values to classification criteria stored in a classification memory;

d) enabling the calculating step when classified data satisfies predetermined time coincidence criteria; and

e) anticipating values associated with the selected parameter for a next instant of time T2 based on statistical information associated with the calculated histograms at an instant of time T1 and at a previous instant of time T0.
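The steps a)-e) of claim 28 can be sketched as a per-frame loop in software. All helper names here are hypothetical, and the anticipation step is simplified to the shift of the histogram peak between two successive frames:

```python
def analyze_frames(frames, criterion, coincidence_ok):
    """Per-frame pipeline of claim 28: histogram, classify, enable, anticipate."""
    hist_prev, hist_curr = None, None
    for frame in frames:                        # a) one data set per instant of time
        hist = [0] * 256                        # b) statistical distribution (histogram)
        for value in frame:
            if criterion(value):                # c) classify against stored criteria
                if coincidence_ok(value):       # d) enable when coincidence criteria hold
                    hist[value] += 1
        hist_prev, hist_curr = hist_curr, hist
        if hist_prev is not None:               # e) anticipate T2 from T0 and T1
            yield peak_shift(hist_prev, hist_curr)

def peak_shift(h0, h1):
    # Hypothetical anticipation: movement of the histogram peak from T0 to T1.
    return h0.index(max(h0)) - h1.index(max(h1))
```

For example, an object whose dominant parameter value moves from 10 to 12 between two frames yields an anticipated shift of -2.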

29. A method of analyzing parameters associated with an event by an electronic device, comprising:

a) receiving data representative of one or more parameters of the event being detected;

b) calculating, for a given instant of time, a statistical distribution, defined as a histogram, of a selected parameter of the event being detected;

c) classifying the data by comparing its value to classification criteria stored in a classification memory;

d) enabling the calculating step when classified data satisfies predetermined time coincidence criteria; and

e) automatically updating, for each instant of time, the classification criteria stored in the classification memory based on statistical information associated with the histogram.
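One plausible reading of the automatic criterion update in claims 10, 20 and 29 is to keep in the classifier's memory table only the values whose histogram count exceeds RMAX/2, i.e. the half-height region around the peak. A software sketch under that assumption (the helper name is hypothetical):

```python
def update_selection_criteria(hist):
    """Rebuild the per-value selection table from the histogram's RMAX/2 level."""
    rmax = max(hist)           # count at the histogram peak
    threshold = rmax / 2       # statistical comparison parameter of claim 10
    return [count > threshold for count in hist]

table = update_selection_criteria([1, 5, 9, 8, 2])
# values 1-3 (counts 5, 9, 8) remain selected; values 0 and 4 are rejected
```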

The Generic Visual Perception Processor (GVPP) has been developed after ten long years of scientific effort. It can detect objects automatically and track their movement in real time.

GVPP, which exceeds 20 billion instructions per second (BIPS), models the human perceptual process at the hardware level by mimicking the separate temporal and spatial functions of the eye-brain system. The processor sees its surroundings as a stream of histograms describing the location and speed of objects.

GVPP has proven able to learn in place to solve a variety of pattern recognition problems. It automatically normalizes for varying object size, orientation and lighting conditions, and can work in daylight or darkness.

GVPP tracks an "object", defined as a certain set of hue, luminance, and saturation values in a specific shape, from frame to frame in a video sequence, anticipating the direction of its motion and using the differences between its edges and the background. This means it can track an object through varying light sources or changes in size, such as when the object approaches the viewer or moves away.

The greatest performance strength of GVPP over present-day vision systems is its adaptation to varying lighting conditions. Today's vision systems demand controlled, shadowless lighting, and even the next generation of prototype systems, designed to work in "normal" lighting, can be used only at dawn and dusk. The GVPP, by contrast, adapts to changes in lighting in real time without recalibration, day or night.

For many decades the field of computing has been constrained by the limitations of traditional processors, and many futuristic technologies have been held back by them. These limitations stem from the basic architecture of such processors: they work by cutting each complex program into simple tasks that the processor can execute, which requires that an algorithm exist for the problem at hand. But in many situations no algorithm exists, or no human is able to formulate one.

Even in these extreme cases GVPP works well: it can solve such problems with its neural learning function. Neural networks are extremely fault tolerant. By design, even if a group of neurons fails, the network only suffers a mild degradation in performance; it will not stop working abruptly. This is a crucial difference from traditional processors, which stop working when even a few components are damaged. GVPP recognizes, stores, matches, and processes patterns. Even if a pattern is not recognizable to a human programmer at the input of the neural network, the network will dig it out of the input.