ideas-philosophy-ethics-weakUtilitarianism

Define strong utilitarianism as "you should maximize utility."

Define weak utilitarianism as "if you would willingly sacrifice A to gain B were you in situation Z (but lacking B), and by taking A away from a person in situation Z you can give B to one or more people, then you should do so."
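
A rough formalization may help parse the conditional; the notation is my own gloss, not part of the note. Write $\mathrm{trade}_Z(A \to B)$ for "a person in situation Z who lacked B would willingly give up A to gain B," and let $p$ be a person in situation Z:

\[
\mathrm{trade}_Z(A \to B) \;\wedge\; \big(\text{taking } A \text{ from } p \text{ gives } B \text{ to } q_1, \dots, q_n\big) \;\Longrightarrow\; \text{you should take } A \text{ from } p.
\]

Strong utilitarianism, by contrast, simply says to take whatever action maximizes aggregate utility, $\arg\max_a \sum_i u_i(a)$.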

Strong utilitarianism has an issue that my friend R.O.F. pointed out: should you torture one person in order to make a very large number of already-content gerbils slightly more content? Should you torture one person in order to make a very large number of onlookers giggle slightly? Weak utilitarianism does not have this issue; it never demands an act that harms the losing party more than it benefits any individual winning party. It does, however, demand things like taking a small amount of money from the rich to give to the poor, when the rich would be only slightly less content but the poor recipients' suffering would be greatly reduced.
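
To make the divergence concrete with illustrative numbers (mine, not R.O.F.'s): suppose the torture costs its victim $10^6$ units of utility, and each of $N = 10^9$ gerbils gains $10^{-2}$ units of contentment. Strong utilitarianism compares aggregates,

\[
N \cdot 10^{-2} - 10^6 = 10^7 - 10^6 > 0,
\]

and so mandates the torture. Weak utilitarianism instead asks whether anyone in the victim's situation would trade being tortured for a gerbil-sized increment of contentment; since no one would, the act is not demanded no matter how large $N$ grows.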