Well, I'm sorry to hear that, but the whole point of a multi-server program, to my way of thinking, is to use the extra servers for the bits that can be flaky. Also, if you know a server is offline you might untick its box in the server setup, but it will be no good reticking it later to retry that server if the missing article parts have already vanished; and yes, the queue will vanish before your eyes.

I had the following setup: three servers, two with download speeds of 15 KB/s and one at 4 Mb/s. The fast server did not carry the group needed but was ticked, and all three were set as primary servers. I ended up with a few rars only 384 KB in length, as all the other segments were deleted while the 4 Mb/s server raced down the queue.

Now, I think your program looks and feels great, so don't get me wrong, but with the queue being deleted the way it is, one can never be sure whether a missing part is genuinely missing or was just deleted from the queue. You could have the situation where the file is somewhat new and still being uploaded; you can't go back and finish it later without corrupting articles.

I have been doing this for about 15 years now and have tried lots of programs, and to be blunt, in my opinion the queue is a major problem if it can't be relied upon to retain the article segments until everything has been tried. I am well aware that pars can repair files, but if you have a fast link one wants to max it.

This is intended as constructive criticism, by the way. I would be interested in other people's thoughts on this point.
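To make the suggestion concrete, here is a minimal sketch (purely hypothetical, not the program's actual code; `Server`, `fetch_segment`, and `try_fetch` are names I made up) of the policy I am describing: a segment is only given up on after every enabled server has been tried, rather than being deleted from the queue on the first failure.

```python
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    enabled: bool  # corresponds to the tick box in the server setup

def fetch_segment(segment, servers, try_fetch):
    """Try every enabled server in priority order.

    The segment is only reported as unrecoverable (None) after ALL
    enabled servers have failed, so a fast server that lacks the group
    cannot cause the segment to be dropped while slower servers still
    hold it.
    """
    for server in servers:
        if not server.enabled:
            continue  # unticked servers are skipped, not treated as failures
        data = try_fetch(server, segment)
        if data is not None:
            return data  # success: segment can leave the queue
    return None  # only now may the queue drop the segment
```

In my three-server example above, the 4 Mb/s server would return nothing for the group it doesn't carry, and the two 15 KB/s servers would still get their turn before anything is deleted.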