It's funny this thread was bumped. When I made those arguments, I was not in a rational state of mind, all things considered. I see I eventually realized I was wrong, which is a good thing. I still hold that I was wrong, and at this point I don't even understand why I argued otherwise (especially since, as far as I know, I've never held that human intelligence/sentience/consciousness is unique when I was otherwise lucid). Goes to show how flawed the human mind is, especially when its provider, the brain (like any organ), is damaged.
Regarding the moral issues of terminating sentient life (the purpose of this discussion, I suppose), I still hold that my question about the paperwork sentience wasn't satisfactorily answered. Personally, I would have no qualms terminating the paperwork sentience, where by "terminating" I mean discontinuing the necessary calculations, not disposing of the existing calculations and state; the latter would be spiteful and purposeless. As long as the calculations can be resumed, termination is more akin to putting the sentience into a suspended state of being. If the sentience doesn't want to be suspended, however, I would hold that neither I nor anyone else can be expected to serve it by any means other than personal choice, much as I cannot expect another being to ensure my survival by force.
This stands in contrast to organic, naturally occurring beings. Humans can't be suspended; on death, sentience is lost permanently, and we cannot copy or resume it. None of this applies to a computer-based sentience, whose software or hardware could be replicated identically with current technology. Likewise, if my body could be replaced after destruction while my mind remained intact, there would be no immediate immoral implication (unless the transfer of mind or destruction of body were forced upon me, violating personal autonomy, which is a different moral issue).