This is the final factor that determines the quality of the captured image. All the work done by the lens and the sensor has to be digitized and compressed before you can see the final product. Moreover, the algorithm used to process the raw sensor data determines how accurately colors, color depth and resolution are handled, and hence the final quality of the image.
The algorithm has to account for pixel overloads, blurring caused by light diffraction, and other sources of noise arising from the camera's components or from the scene itself. An algorithm that can properly cancel out that noise, smooth out artifacts and make the right color and contrast decisions will produce a noticeably better image.
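To make the noise-cancellation idea concrete, here is a minimal sketch of one such processing step: a 3x3 median filter, a classic way to suppress isolated "hot" pixels. This is a hypothetical, heavily simplified illustration, not any manufacturer's actual pipeline, which would also demosaic, tone-map, sharpen and compress.

```python
# Minimal sketch of one ISP-style denoising step: a 3x3 median filter.
# Hypothetical and simplified; real camera pipelines do far more.

def median_denoise(img):
    """Apply a 3x3 median filter to a 2D list of pixel values (borders kept as-is)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = sorted(window)[4]  # median of the 9 neighbors
    return out

# A flat grey patch with one "hot" pixel, as sensor noise might produce.
patch = [[100] * 5 for _ in range(5)]
patch[2][2] = 255
clean = median_denoise(patch)
print(clean[2][2])  # prints 100: the outlier is replaced by its surroundings
```

The median is a deliberately blunt choice here: unlike a simple average, it discards extreme outliers entirely instead of smearing them into neighboring pixels, which is why real pipelines use it (and smarter variants) for salt-and-pepper-style sensor noise.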
This is partly why smartphones whose cameras look identical on paper can end up shooting images of very different quality.
Even though you might never ask Apple or Samsung which sensor they used or what the MTF of their optics is, at least you now know that megapixels aren't the deity we often make them out to be. Let's just hope they don't try to convince you that cramming 20MP onto a relatively small sensor behind mediocre optics is good enough, because you now know it isn't.