But I would agree that, as a human, I want my emotions linked to my morality. If you get too experimental with morality you can end up a psychopath, or an Ayn Rand. Still, I have to give her credit for offering an objective definition of morality as mainly a means-ends statement with a value structure.
A lot of moralities tell you when you can stop having empathy and kill or hurt the other.
I think if you want a human morality, one easily associated with positive and negative emotions, you are very constrained in the kinds of things you can select as goals unless you are psychologically disordered. Mammals reflexively signal their emotional state to each other (all mammals make and read facial expressions) and interpret emotional states using empathy (unlearned) or sympathy (learned).
I'm not really interested in that angle, but it is true that people treat robots that display emotional expressions differently than they treat boxcar robots.