OK, it's too easy just to say +1 or -1 without giving any reasons, so I'm going to explain why I think this is a logical step for the program WITHOUT making it bloatware.
I HATE bloatware personally, so anything that avoids it is welcome. Firstly, some facts (as I see them).
The vast majority of posts on usenet are made the following way:
1) Some files are on your PC that you wish to post.
2) Those files get archived (RAR is the most popular) and are split into reasonable size chunks during the process.
3) Hopefully an NFO file is made, along with an SFV.
4) QuickPar is used to generate around 10-20% PAR2 recovery files for the split RAR files above.
5) A usenet posting program is opened and, after a little bit of tricky(ish) configuration, all the above files are added to the queue.
6) A naming format is required for the post.
7) Hopefully an NZB file is generated on the fly, and the queue is actioned to be posted.
Voila. Your files are now on usenet.
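Just to make steps 2 and 4 concrete, here's a minimal sketch of the command lines those tools get invoked with. The file names, 15 MB split size, and 10% redundancy are purely illustrative examples, not anything Alt.Binz does today:

```python
# Sketch of the manual workflow: build the rar and par2 command lines
# that steps 2 and 4 describe. All names and sizes are illustrative.

def rar_split_cmd(source, archive, split_mb=15):
    # rar a -v<size>m : add to archive, splitting into <size>-MB volumes
    return ["rar", "a", f"-v{split_mb}m", archive, source]

def par2_create_cmd(par2_base, volumes, redundancy=10):
    # par2 create -r<percent> : generate PAR2 recovery files for the volumes
    return ["par2", "create", f"-r{redundancy}", par2_base] + volumes

cmd1 = rar_split_cmd("movie.avi", "movie.rar")
cmd2 = par2_create_cmd("movie.par2", ["movie.part1.rar", "movie.part2.rar"])
print(cmd1)
print(cmd2)
```

You'd normally never see these commands; QuickPar and WinRAR wrap them in a GUI, which is exactly the kind of glue being proposed here.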
Let's quickly recap. We needed quite a few things there:
a) Files
b) RAR
c) QuickPar/PAR2
d) NZB
e) A configured news server.
That's about 3 or 4 programs just to post.
Now if we look at what Alt.Binz does already, we find it almost 100% matches that list BY ITSELF!
The fact that you can download files means your news server is already configured for posting (assuming your account allows you to post). The program uses RAR extensively, and it already has a file manager tab, so pointing at files on your hard disk is supported right now.
PAR2 is catered for extensively via the cmdline tool, and the only thing left is NZB creation, which is sort of covered already by the CTRL-E shortcut in Alt.Binz that generates NZB files from the queue, so that requirement must be implementable as well.
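For anyone unfamiliar with what CTRL-E actually has to produce: an NZB is just a small XML file listing, for each posted file, its newsgroups and the message-IDs of its segments. A minimal sketch of generating one (the subject, group, and message-IDs below are made-up placeholders):

```python
import xml.etree.ElementTree as ET

# Namespace from the published NZB DTD
NZB_NS = "http://www.newzbin.com/DTD/2003/nzb"

def make_nzb(files):
    # files: list of dicts with 'subject', 'groups', and 'segments',
    # where each segment is a (bytes, number, message_id) tuple.
    ET.register_namespace("", NZB_NS)
    nzb = ET.Element(f"{{{NZB_NS}}}nzb")
    for f in files:
        file_el = ET.SubElement(nzb, f"{{{NZB_NS}}}file", subject=f["subject"])
        groups = ET.SubElement(file_el, f"{{{NZB_NS}}}groups")
        for g in f["groups"]:
            ET.SubElement(groups, f"{{{NZB_NS}}}group").text = g
        segs = ET.SubElement(file_el, f"{{{NZB_NS}}}segments")
        for nbytes, number, msgid in f["segments"]:
            seg = ET.SubElement(segs, f"{{{NZB_NS}}}segment",
                                bytes=str(nbytes), number=str(number))
            seg.text = msgid
    return ET.tostring(nzb, encoding="unicode")

xml_out = make_nzb([{
    "subject": "example.part1.rar (1/2)",
    "groups": ["alt.binaries.test"],
    "segments": [(500000, 1, "fake-id-1@example"),
                 (500000, 2, "fake-id-2@example")],
}])
print(xml_out)
```

The point being: the program already knows every one of those values for anything in its queue, so writing them out at posting time is the same job in the other direction.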
Now I can imagine that posting is almost just a right-click away. Imagine going into the file manager tab, highlighting a 400 MB AVI file, right-clicking and selecting "Rar and post".
The dialog box that pops up asks for the split size; say we enter 15 MB, and it goes off and compresses/splits the AVI, creates PAR2 files based on those split files, then places it all in the queue for posting. By default it also adds an NZB file to that queue for the relevant files, and the only thing left for you to do is select the newsgroup you wish to post into.
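To put numbers on that example (using the 400 MB file, 15 MB split, and the low end of the 10-20% PAR2 rule of thumb from earlier, all just illustrative):

```python
import math

avi_mb = 400        # size of the example AVI
split_mb = 15       # split size entered in the dialog
redundancy = 0.10   # 10% PAR2, the low end of the 10-20% rule of thumb

volumes = math.ceil(avi_mb / split_mb)   # number of RAR volumes produced
parity_mb = avi_mb * redundancy          # MB of PAR2 recovery data

print(volumes)    # 27 volumes
print(parity_mb)  # 40.0 MB of parity
```

So one right-click would queue roughly 27 RAR volumes plus around 40 MB of PAR2 files, plus the NZB, with no other program involved.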
One program doing 4 or 5 jobs, all of which it currently does in reverse!
I really think this is a win/win scenario. The program already has ALL the requirements; it just doesn't have the GUI to mash them together. There's very little "bloat" to add to make this work (apart from the fact that I don't know how to code and would be relying on Rdl's skills!).
How many more people might actually help out with a few requests by posting to usenet if it were as easy as this? A few, I hope. (I'm not actually that fussed about that part of things, to be honest, but I do have a point.)
Compare this to the nonsense of header downloading (why oh why would you bother?): the need for database space, extra memory, corrupt header files needing to be deleted and managed... now THAT sounds like bloat!
So there are my reasons. I think it's logical and reasonably well thought through. Feel free to rip it to pieces, but ONLY if you take the time to argue your point; otherwise I'll ignore you. I can be persuaded, but only with a good argument!