Hopefully my image selection doesn’t show my bias. But seriously, I don’t think either is “better” - they approached it in two different ways.
Data got the emotion chip, but had to learn how to use it. We saw all of that on-screen.
Android got the emotion chip upgrade (which also seemed to come with the requisite control software) and was able to hit the ground running.
That’s a good way to differentiate their capacities for emotion: Android could experience but not express, while Data could express but not experience. Their respective chips allowed each of them to do both.