The problem with utilitarianism, or teleology, or whatever you want to call it, is that it justifies illegal and immoral actions as long as they're "for the good of everyone." Take animal testing: it was designed to protect us from dangerous products, and so it serves the good of humans, but was it moral to test possibly harmful products on animals that had no choice in the matter? Or what about a scientist who wants to experiment on living humans to try to create some sort of superhuman "for the greater good"? Isn't that justified under utilitarianism (or teleology; as far as I can tell they amount to the same thing)?

Also, doesn't your philosophy just treat morality as a numbers game, where whoever saves the most lives wins, rather than whoever protects our basic rights? Under your philosophy you are not held responsible for your actions as long as you save lives, and the line between right and wrong starts to blur. Is murder not always wrong? Is stealing not always wrong? If you start to allow them in certain circumstances, the line between what is right and what is wrong becomes unclear, and you end up with people trying to justify their actions because they thought it was "for the good of the community."

Finally, we cannot predict the future. For all we know, the child you are referring to will applaud your honesty, or the criminal would have a change of heart. You cannot control their actions, and therefore you cannot truly be held responsible for them, so long as you did nothing wrong yourself.