
Author Topic: Installing programs  (Read 830 times)
« on: October 04, 2008, 17:04:11 PM »

Linux is all new to me. Can someone give me step-by-step instructions on installing the following program?

nznzb 0.70

nznzb is a multi-threaded NZB file downloader for usenet. You need
to grab a nzb file from somewhere, then pass it as an argument to
nznzb. For example:

   nznzb somethinguseful.nzb [.. anothernzb.nzb] [extract dir]

This will kick off multiple download streams to improve download speed.

Key features of nznzb:

 - Can use multiple download threads to speed up downloads

 - Can connect to multiple usenet servers.

 - Can resume if you break out of a download (it does have
   to download a fair bit to work out whether it has a complete
   file or not)

 - If you have unrar installed somewhere it can dynamically
   unrar nzb's of multipart rar archives. It makes sure it
   downloads the rars in the correct order to unrar them. This
   often means that you can watch a media file while it downloads
   simply by pointing a media player at the file being extracted,
   or perhaps you share out the directory where the media file
   is being written and can then watch it using XBMC or some other
   external media player device.

 - Can automatically launch a media player such as mplayer
   if you are downloading a multipart rar archive. It can do this
   after the first rar file downloads.
 - Checks yEnc CRC codes in usenet 'segments' as they download and
   can switch to alternate usenet servers to try to grab a 'segment'
   which is not corrupt. The aim is to prevent corrupt downloads
   that might require you to use PAR2 files to fix the problem.

 - If your NZB file consists of rar files and par2 files, nznzb
   can finish 'early' if it detects that the unrar'ing process
   has finished without errors (which means that the par2 files
   aren't required). This saves on traffic.


On Linux:

   gcc -o nznzb -lpthread -lssl -DLINUX -DUSE_SSL nznzb.c

On Mac OS X:

   gcc -o nznzb -lpthread -lssl -lcrypto -DMACOSX -DUSE_SSL nznzb.c

(I'm not too sure about other Unixes, but if you
 have a BSD-type system, try compiling it as for OS X)

There's also a little script make.sh that I use to compile:

   ./make.sh nznzb

Now copy the nznzb binary to somewhere in your PATH:


  sudo cp nznzb /usr/local/bin

If you want to use the unrar feature, you need to have the unrar binary
somewhere in your PATH. If you want to use the auto-launching of a media
player then you need to have mplayer or some other player in your PATH or
explicitly specify the path.
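As a quick sanity check before enabling those features, you can verify which of the optional helpers are reachable. This is just a sketch using the standard `command -v` shell builtin:

```shell
# Check whether the optional helper programs nznzb can call are in PATH.
# unrar is needed for the auto-unrar feature, mplayer for auto-launching.
for tool in unrar mplayer; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: not in PATH"
  fi
done
```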


You have to have a config file in your home directory.

The filename is $HOME/.nznzbrc

The basic config looks like:
# comments
server = nntp.someprovider.net
port = 119

# How many download streams do you want
thread = 4

# Set auto_unrar = 1 if you have unrar in the PATH and want
# to dynamically unrar as you download
auto_unrar = 1

# Where do you want to extract the unrar output to (or
# specify it as the last argument on the command line)
extract_to = /mystuff


If you want to use SSL with your provider then
put in the right server and port as well as ssl=1:

server = nntps.someprovider.net
port = 563
user = fred
pass = secret
ssl = 1

user and pass are for NNTP servers that require a username and
password. Leave them out or comment them out if your server doesn't
require a user/pass.

thread is for how many parallel download streams you want.

auto_unrar=1 means that if the thing you're downloading is a
multipart rar, then start unrar'ing it as soon as the first
rar part finishes downloading (you need to have unrar in your path
somewhere for this to work). For example, if the first file
is myhomevideo.rar, then that will get downloaded first,
and it'll run unrar in the background. As soon as myhomevideo.r00
downloads, it'll get fed into unrar to continue the process, and so on.

extract_to is only relevant to the unrar process. It's the
destination directory for the unrar. If you don't specify it in the
.nznzbrc file you can put it on the command line:

   nznzb somefile.nzb /mystuff

Multiple usenet providers
A new feature of v0.64 is the ability to use multiple usenet servers. In your .nznzbrc
file you might specify three servers as follows:

  server = usenet-provider1.something.com
  port = 119
  user = someuser
  pass = somepass

  server2 = usenet-provider2.other.com
  port2 = 119
  user2 = someuser
  pass2 = somepass

  server3 = secureusenet.usenet-provider1.something.com
  port3 = 443
  user3 = someuser
  pass3 = somepass
  ssl3 = 1

By default nznzb will only use the first server to download from. A big change to the code
is that it now checks the CRC values of yEnc-encoded material as it's downloaded. If a CRC is
wrong it will try the other usenet servers (if any). The idea is that another server or other
usenet provider may have this particular segment in a non-corrupt form. This helps out
enormously if you are trying to stream a file that nznzb is downloading. Normally, nznzb
would have downloaded the corrupt segment and you'd have to wait until the end of the entire
NZB download before applying PAR2 files to try to fix it.

You can also split the download threads among all your servers. Just put the following
in your .nznzbrc file:

  split_load = 1

Now, when nznzb starts up, the first download thread will use the first server, the second
download thread uses the second server, the third download thread uses the third server, and
(assuming you only specified three servers) the fourth download thread uses the first server
again, and so on.
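That round-robin assignment can be illustrated with a few lines of shell. This is an illustration only; the numbers are examples, not nznzb output:

```shell
# With split_load = 1 and 3 servers, thread i uses server ((i - 1) mod 3) + 1.
nservers=3
for i in 1 2 3 4 5 6; do
  echo "thread $i -> server $(( (i - 1) % nservers + 1 ))"
done
```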

If you use 'split_load' and nznzb hits a CRC error on a segment, it'll try your other servers
before giving up.

If you turn split_load off (i.e. split_load = 0), all your downloads will use the first server
specified in .nznzbrc ... but if a CRC error is encountered, and you have other
servers listed, then it will try those other servers to get the correct segment.

Automatic Media Player
A new feature in v0.65 is automatic playing of media content as it downloads. Sure,
you could already stream the content that is downloading, but now nznzb can automatically
launch mplayer or some other player to play back content after the first rar part
is downloaded.

There are quite a few assumptions with this setup:

 - You need to have auto_unrar = 1 in your .nznzbrc
 - The content you're downloading consists of a multi-part rar archive
 - You have an internet connection that is faster than the bitrate of the media
 - There's only one media file within the rar archive
 - You need to set the new media_player variable in .nznzbrc. You could do:

      media_player = mplayer

   OR possibly

      media_player = mplayer-svn -vo xv -demuxer lavf -cache 2048

   NB: The demuxer lavf stuff is to allow you to seek back and forth in an incomplete
   media stream, but I find it only works in the latest SVN versions of mplayer. Actually,
   I have a hard time getting it to work at all.

   On OSX I use (use backslashes before spaces; don't use quotes):

      media_player =  /Applications/MPlayer\ OSX.app/Contents/Resources/External_Binaries/mplayer.app/Contents/MacOS/mplayer

End early if unrar succeeds
You can force all the download threads to end if nznzb notices that
unrar has ended with a zero exit status. This is advantageous when you have
a typical nzb file that includes par2 files for extra redundancy. If
unrar succeeds OK, there's not much point in downloading the par2 files. If
you want to try this, put this in the .nznzbrc:

   end_if_unrar_ok = 1

v0.70  - Fixed a big bug (thanks to a colleague) that corrupted downloads at the time
         the media player auto-launched.
         Added some code to better clean up temp files if unrar ends early, and if you hit
         Ctrl-C it cleans up a bit better as well.
v0.68  - Finally worked out why I could never get SSL to work on OSX. All I needed was -lcrypto
         on the compile line. Doh! So now it compiles with SSL support on OSX and seems to work OK
         Fixed some bugs to do with uudecoding. Still not sure how well it works (or whether it
         works at all ... as I hardly see any uu stuff these days). Also new is being able to have
         a mix of SSL and non SSL servers listed in your .nznzbrc.

v0.67  - v0.65 did not really work on OSX, so I've been hurrying to try to fix that. Seems to go OK
         in my minimal testing. Also 0,65 when using auto mplayer launching on linux tended to only
         play a few minutes of the media and then die (mplayer that is). I think I've fixed that too.
         There's also the end_if_unrar_ok .nznzbrc setting now. NB: the ssl stuff still does not
         work on OSX

v0.65  - Automatic media player function after the first rar is downloaded. Not tested greatly.
         Occasionally it misses the name of the media file from the unrar output.

v0.64  - Downloading of >4GB nzb's should show the correct download stats now.
         Changed the yenc decode stuff so each segment is decoded one by one as
         they download (enables me to check crc errors early on). Can now specify
         multiple download servers, and auto-switches when CRC errors are encountered.
         Added the 'split_load' config variable to enable spreading download threads
         across multiple servers.
         NB: the ssl setting applies to ALL servers. Sorry.
         This version is more optimized for yenc stuff, so I've probably broken
         all the uudecode stuff (again)

v0.58  - Now supports more than one nzb file on the command line. You can
         still specify an output dir for the unrar'd files. Just make sure
         you specify a directory as the last argument

v0.57  - Now creates a temp directory in your current directory called
         YYYYMMDD (eg. 20080126) and puts all the temp files in there. It makes
         it easier to clean up if you hit control C. Also Arne Haak
         reported a bug (Wow, there's at least one person who uses this
         thing). Basically some usenet hosts report a 201 response for
         'server ready - posting NOT allowed' as an alternate to the 200
         'server ready' response. I've updated to allow either. There's
         probably some other minor tweaks in the code. Check the comments
         in the source for more info.
v0.50  - 10.9.2007
         Added in SSL support, and also got the uudecode stuff going again. Not well
         tested though. yEnc stuff works very well though.
         uudecoding is handled by running the program 'uudecode'
         externally (ie. must be in your path somewhere)
         yEnc decoding is handled internally
         Can now compile on a Mac as well. Lots of ifdefs in the pseudo term
         stuff for Macs
         You can actually compile it without SSL support by commenting out the USE_SSL
         define at the top of the source code.


- Since I started writing this thing I have occasions when all the socket
  connections to my usenet host die (or more importantly fgets returns an error).
  There is quite a bit of logic to try to reconnect and carry on ... but
  I'm not 100% sure it's the greatest solution. Still working on it
- Change DEBUG to 1 in the source file to get more debug output
- It will tend to resume (to some degree). If you have downloaded blah.rar,
  blah.r00, blah.r01, and then ctrl-c out and start again, it will start to
  download blah.rar, but work out it's already got it while reading the first segment.
  ie. it'll download a bit of each of those files again, before eventually
  getting to blah.r02. The stats about the ETA and transfer speed will be wildly
  wrong ... but it will finish OK.
- The mini-expect code needs to be a bit smarter. Has a horrible buffer
  limit that is annoying.
- Each download thread generates a file called 'thread_....', and each file
  segment will end up with a part_x_y file. You can generally do a
   rm thread_* part*
  before starting nznzb again
- It creates a temp directory in the current directory called YYYYMMDD (eg. 20080819)
  Normally this is empty after you run nznzb, but if you ctrl-c out of nznzb you'll
  end up with files left in these directories. They're good for my diag purposes,
  but you might want to rm -r 20080* or something similar.
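Both cleanup steps can be rolled into a small script. This is a sketch only: the date glob below assumes the only YYYYMMDD-named directories in the current directory are nznzb temp dirs, so double-check before running it.

```shell
# Remove nznzb leftovers after a Ctrl-C: thread_* and part_* work files,
# plus date-named temp directories such as 20080819.
rm -f thread_* part_*
rm -rf 20[0-9][0-9][01][0-9][0-3][0-9]
```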
« Reply #1 on: October 04, 2008, 22:14:31 PM »

I found this: http://forum.pardus-linux.nl/index.php?topic=282.0
I hope it is what you are looking for.
I don't know anything about the program.
It is in Dutch, but that is all I have at the moment.
Try http://translate.google.com/ to translate it into English.
« Reply #2 on: October 07, 2008, 21:43:09 PM »

I think that's rather straightforward. You should try 'gcc -o nznzb -lpthread -lssl -DLINUX -DUSE_SSL nznzb.c' in Konsole and that's it.