Author Topic: anyway to speed up decoding?  (Read 6915 times)

Rekrul

  • Guest
Re: anyway to speed up decoding?
« Reply #15 on: December 15, 2010, 03:15:47 am »
It's also a more efficient use of disk space since you're not keeping an encoded copy of the entire file.

If you set up article caching correctly, it does not write to the disk until it decodes; all encoded parts are stored in RAM, then decoded to the HDD.

But then if there's a crash or something goes wrong, you lose all the parts that have been downloaded so far. True, that would probably be a rare occurrence, and it wouldn't matter so much on small files, but you don't want to get to the last part of a large post and have it screw up so that you have to re-download the whole thing.

Why not decode each part as it downloads? That way there wouldn't be 50+ parts to decode at once.

That is the normal way it works: each file is decoded after download.

I originally misinterpreted the answer above. I didn't realize at the time that you thought I was simply describing the way AltBinz already works.

Let's say that someone posts a plain video file and it's in 50 parts. AltBinz will download each part and save the encoded data to the temp directory. When all 50 parts of the file have been downloaded, it will decode them back into the video file and then delete the parts. It's when AltBinz needs to decode the 50 parts that the program seems to go off into limbo.

Instead, I was suggesting that AltBinz could decode each part of each file as it downloads. After the first part is downloaded, decode it; when part 2 is finished, decode it and add it to the file. And so on. If parts 1-5 have been downloaded and then part 7 finishes, you decode that to a new file, and when part 6 is done, you can merge 1-5, 6 and 7. And so on. That way, the program is never dealing with more than a few parts at a time. As I said, Binary News Reaper 2 does this and it seems to work well. There is never a pause for decoding. Of course BNR2 has a lot of other bad points, but it does the actual downloading and decoding quite well.
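
To make the idea concrete, here is a rough sketch of that decode-as-you-go approach in Python. It is not Alt.Binz code; download_parts() and yenc_decode() are hypothetical stand-ins, and out-of-order parts are simply held back until the gap in front of them is filled:

def assemble_incrementally(out_path, total_parts):
    # Sketch only: download_parts() and yenc_decode() are hypothetical helpers.
    pending = {}       # decoded parts that arrived ahead of their turn
    next_part = 1      # the next part number we can append to the file
    with open(out_path, "wb") as out:
        for part_no, raw in download_parts(total_parts):   # parts arrive in any order
            pending[part_no] = yenc_decode(raw)             # decode immediately, no encoded copy kept
            while next_part in pending:                     # flush every contiguous part we now have
                out.write(pending.pop(next_part))
                next_part += 1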

Another advantage to this method is that since it creates partial files as it goes, the download can be paused and the file previewed to see if it's a fake, or if it's something you want. Again, that doesn't matter so much with small files, but I've seen people post entire 500MB videos as a bare video file in 3000+ parts.

Offline Hecks

  • Contributor
  • ***
  • Posts: 2011
  • naughty cop
Re: anyway to speed up decoding?
« Reply #16 on: December 15, 2010, 08:21:53 am »
Please, as this is turning from a help thread into a feature request, could it be posted in the Requests forum along with the others in the queue? The discussion is appreciated, but note that no such changes will ever be applied to 0.25.0.

Offline davidq666

  • Contributor
  • ***
  • Posts: 1302
  • Watashi Wa Ero Desu!
Re: anyway to speed up decoding?
« Reply #17 on: December 15, 2010, 08:41:30 am »
...
Let's say that someone posts a plain video file and it's in 50 parts. AltBinz will download each part and save the encoded data to the temp directory. When all 50 parts of the file have been downloaded, it will decode them back into the video file and then delete the parts. It's when AltBinz needs to decode the 50 parts that the program seems to go off into limbo.

That is not the way alt.binz is supposed to be doing it.

Instead, I was suggesting that AltBinz could decode each part of each file as it downloads. After the first part is downloaded, decode it; when part 2 is finished, decode it and add it to the file. And so on.

That is how alt.binz is supposed to work. When files get queued for decoding, something is wrong: sometimes the NZB itself causes it, sometimes a corrupted queue file, and sometimes the CPU simply lacks the power to keep up.

If parts 1-5 have been downloaded and then part 7 finishes, you decode that to a new file, and when part 6 is done, you can merge 1-5, 6 and 7. And so on. That way, the program is never dealing with more than a few parts at a time.
...
Another advantage to this method is that since it creates partial files as it goes, the download can be paused and the file previewed to see if it's a fake, or if it's something you want. Again, that doesn't matter so much with small files, but I've seen people post entire 500MB videos as a bare video file in 3000+ parts.

Before going into this we must differentiate between RAR archives (xyz.rar, xyz.r01, xyz.r02, etc.) and split files (xyz.avi.001, .002, etc.). RAR archives are never merged but can be extracted when complete; split files, on the other hand, get merged. Alt.binz auto-repairs and extracts RAR archives, but it doesn't join split files.

What you describe above is pretty much a feature called nzbplay, where alt.binz extracts whatever is in the RAR as far as possible. It works with most RAR archives, but has the drawback that par-checking needs to be stopped during that time. So if there is a corrupt file it won't be repaired automatically and further extraction stops. Version 0.25 has it as well, but for it to work properly the RARs need to be sorted by hand and nzbplay must be activated before starting the download. In more recent versions this is no longer the case.
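
For reference, "joining" split files is nothing more than concatenating the pieces in order. A minimal sketch in Python (the file names are only examples, and as noted above this is not something alt.binz does for you):

import glob, shutil

def join_split_files(base_name):
    # e.g. base_name = "movie.avi" joins movie.avi.001, movie.avi.002, ...
    parts = sorted(glob.glob(base_name + ".[0-9][0-9][0-9]"))
    with open(base_name, "wb") as out:
        for part in parts:
            with open(part, "rb") as src:
                shutil.copyfileobj(src, out)   # append each piece in numeric order
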
« Last Edit: December 15, 2010, 08:43:52 am by davidq666 »

Offline Hecks

  • Contributor
  • ***
  • Posts: 2011
  • naughty cop
Re: anyway to speed up decoding?
« Reply #18 on: December 15, 2010, 08:52:48 am »
David, you are missing the issue here. ;) The point being made is about the decoding of file parts, i.e. NZB segments, i.e. yEnc encoded articles, on the fly and in RAM, rather than storing them undecoded until all parts for a file are available and then decoding them in one operation. If there's a reason why Alt.Binz takes this approach, Rdl will be able to explain that.

And as I said, this discussion should continue, if at all, in the Requests forum. A Contributor who wants to see it implemented should open a new thread there and add it to the queue.

Offline davidq666

  • Contributor
  • ***
  • Posts: 1302
  • Watashi Wa Ero Desu!
Re: anyway to speed up decoding?
« Reply #19 on: December 15, 2010, 09:12:10 am »
David, you are missing the issue here. ;) The point being made is about the decoding of file parts, i.e. NZB segments, i.e. yEnc encoded articles, on the fly and in RAM, rather than storing them undecoded until all parts for a file are available and then decoding them in one operation. If there's a reason why Alt.Binz takes this approach, Rdl will be able to explain that.

And as I said, this discussion should continue, if at all, in the Requests forum. A Contributor who wants to see it implemented should open a new thread there and add it to the queue.


Maybe it depends on what he means by parts: RARs (individual separate files that are parts of one RAR archive), split files (individual separate files that are parts of one file), or blocks.

Offline Hecks

  • Contributor
  • ***
  • Posts: 2011
  • naughty cop
Re: anyway to speed up decoding?
« Reply #20 on: December 15, 2010, 07:25:22 pm »
From the context, and since the discussion is about decoding, not unraring, it seems clear that what is meant here is encoded articles, i.e. the ones identified in NZBs as 'segments' and requested by message ID from Usenet servers. These are currently stored by Alt.Binz before decoding the whole file that they collectively represent, which in most cases is part of a RAR archive that is then unrared.
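
For anyone unfamiliar with the format, here is a short sketch of reading those segments out of an NZB. The file name is an example, the standard newzbin namespace is assumed, and this is not Alt.Binz code:

import xml.etree.ElementTree as ET

NS = "{http://www.newzbin.com/DTD/2003/nzb}"

def list_segments(nzb_path):
    # yields (subject, segment number, message ID) for every encoded article
    root = ET.parse(nzb_path).getroot()
    for file_el in root.iter(NS + "file"):
        for seg in file_el.iter(NS + "segment"):
            yield file_el.get("subject"), int(seg.get("number")), seg.text

for subject, number, msg_id in list_segments("example.nzb"):
    print(number, msg_id)   # each message ID is what gets requested from the news server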

Each yEnc encoded article contains a header and footer within the article body itself, with enough information to decode that portion of the binary file without needing any of the others to hand. These can be assembled on the fly too (note: not the same as joining split files). I think the (legitimate) question here is why Alt.Binz instead waits for them all to be available before starting any decoding. I have no idea what the answer is; perhaps there's some processing requirement that Rdl will be able to tell us about.
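
To illustrate why each article is self-contained, here is a hedged sketch of decoding one multi-part yEnc article and writing it at the offset its own =ypart header names. It is not Alt.Binz code and skips details a real decoder needs (CRC checks, dot-unstuffing, single-part posts with no =ypart line):

def yenc_decode_lines(lines):
    # lines: the encoded data lines of one article, as bytes without CRLF
    out, esc = bytearray(), False
    for line in lines:
        for b in line:
            if esc:
                out.append((b - 64 - 42) & 0xFF)   # escaped byte: undo the +64, then the +42
                esc = False
            elif b == 0x3D:                        # '=' marks an escaped byte
                esc = True
            else:
                out.append((b - 42) & 0xFF)        # normal byte: undo the +42
    return bytes(out)

def write_article(body_lines, out_file):
    # body_lines[0] is "=ybegin ...", body_lines[1] is "=ypart begin=... end=...",
    # body_lines[-1] is "=yend ..."; only this one article is needed to place its bytes
    ypart = dict(kv.split(b"=", 1) for kv in body_lines[1].split()[1:])
    begin = int(ypart[b"begin"])                   # 1-based offset into the target file
    out_file.seek(begin - 1)
    out_file.write(yenc_decode_lines(body_lines[2:-1]))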

Rekrul

  • Guest
Re: anyway to speed up decoding?
« Reply #21 on: December 29, 2010, 10:38:42 am »
From the context, and since the discussion is about decoding, not unraring, it seems clear that what is meant here is encoded articles, i.e. the ones identified in NZBs as 'segments' and requested by message ID from Usenet servers. These are currently stored by Alt.Binz before decoding the whole file that they collectively represent, which in most cases is part of a RAR archive that is then unrared.

Exactly.

Each yEnc encoded article contains a header and footer within the article body itself, with enough information to decode that portion of the binary file without needing any of the others to hand. These can be assembled on the fly too (note: not the same as joining split files). I think the (legitimate) question here is why Alt.Binz instead waits for them all to be available before starting any decoding. I have no idea what the answer is; perhaps there's some processing requirement that Rdl will be able to tell us about.

Actually, believe it or not, UUE articles can also be decoded one piece at a time. The encoded data in UUE articles is perfectly aligned with the lines of data, so there's no data that spans from one article to the next. Then, as long as the individual decoded segments are numbered in the correct order, they can be joined together to recreate the original file. I originally didn't think it was possible, but it works. You just need software that won't mind the lack of a "begin" line in the other parts, or you can add the line yourself to fool the decoder.
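
For what it's worth, a small sketch of that idea: Python's binascii.a2b_uu() decodes a single uuencoded line, so one article body can be decoded on its own by walking its lines, with no "begin" header needed at all. The decoded pieces are then concatenated in article order. This is only an illustration, not anyone's reader code:

import binascii

def uudecode_article(text):
    out = bytearray()
    for line in text.splitlines():
        # skip the begin/end markers and the "`" padding line if they happen to be present
        if not line or line.startswith(("begin ", "end")) or line == "`":
            continue
        out.extend(binascii.a2b_uu(line))   # each line decodes independently of the others
    return bytes(out)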