Hi! I want to do emotion recognition from pictures, using the JAFFE database of Japanese models expressing a different emotion in each picture. I want to run a genetic algorithm (GABIL) and a multilayer neural network on the problem so I can compare them afterwards, but I'm having trouble extracting the information I need from the pictures to pass to the algorithms.

What I have done so far is convert the pictures to grayscale and pass the shade of each pixel to the algorithms (the images are 32x44), but I haven't had any luck yet. I was trying to keep the approach simple, but I think some feature extraction may be needed to isolate the regions of the face that actually matter. Can someone point me to a good approach for this, or share some thoughts on the matter? I would really appreciate it.
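For reference, here is roughly what my current preprocessing looks like, plus an optional PCA ("eigenfaces") step I was thinking of trying instead of raw pixels. This is only a minimal sketch: it assumes OpenCV, NumPy, and scikit-learn are available, and the file path and number of components are placeholders, not anything specific to JAFFE.

```python
# Minimal preprocessing sketch (assumes OpenCV, NumPy and scikit-learn;
# the glob path and n_components are placeholders, not JAFFE specifics).
import glob

import cv2
import numpy as np
from sklearn.decomposition import PCA

def image_to_vector(path, size=(32, 44)):
    """Load an image, convert to grayscale, resize, and flatten to [0, 1]."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img = cv2.resize(img, size)              # OpenCV expects (width, height)
    return img.astype(np.float32).ravel() / 255.0

# Feature matrix: one row of 32 * 44 = 1408 pixel intensities per image.
paths = sorted(glob.glob("jaffe/*.tiff"))    # placeholder path to the images
X = np.array([image_to_vector(p) for p in paths])

# Optional dimensionality reduction ("eigenfaces"): rather than hand-picking
# facial regions, keep only the top principal components of the pixel vectors.
pca = PCA(n_components=40)                   # 40 is an arbitrary starting point
X_reduced = pca.fit_transform(X)

print(X.shape, X_reduced.shape)
```

The PCA step is just one idea I've seen for reducing raw pixel vectors; I don't know whether it captures the facial regions that matter for emotions, which is part of why I'm asking.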