
Author Topic: Any Jdownloader-Like software to download sites on PDF format?  (Read 3087 times)


ericlanser

    Topic Starter


    Newbie

    • Experience: Beginner
    • OS: Windows 10
    Basically, just copy a link and then download the various pages in PDF format.

    I mostly need it for tutorial sites that have an index system with separate categories. I don't have a permanent connection, so I usually download stuff to read later.

    Doing it one by one is kind of a pain in the *censored*, and software that does the work would be great.
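    Something along these lines is what I mean, shown as a rough sketch only: it assumes a Chromium-based browser is installed and on the PATH, and the browser name and URLs below are just placeholders.

    Code:
    # Sketch: save a list of pages as PDFs using headless Chromium's --print-to-pdf flag.
    # Assumes a Chromium/Chrome binary is on PATH; adjust "chromium" and the URLs.
    import subprocess
    from pathlib import Path

    urls = [
        "https://example.com/tutorial/page1",   # placeholder tutorial pages
        "https://example.com/tutorial/page2",
    ]

    out_dir = Path("saved_pdfs")
    out_dir.mkdir(exist_ok=True)

    for i, url in enumerate(urls, start=1):
        pdf_path = out_dir / f"page_{i:03d}.pdf"
        subprocess.run(
            ["chromium", "--headless", "--disable-gpu",
             f"--print-to-pdf={pdf_path}", url],
            check=True,
        )
        print(f"Saved {url} -> {pdf_path}")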

nil

    Global Moderator


    Intermediate

    • Thanked: 15
    • Experience: Experienced
    • OS: Linux variant

    Do not communicate by sharing memory; instead, share memory by communicating.

    --Effective Go
DaveLembke

    Sage

    • Thanked: 662
    • Certifications: List
    • Computer: Specs
    • Experience: Expert
    • OS: Windows 10

    Re: Any Jdownloader-Like software to download sites on PDF format?
    « Reply #2 on: April 30, 2019, 11:24:06 AM »
    Went looking for wget and found it here if you need it: http://gnuwin32.sourceforge.net/packages/wget.htm

    Other references online have broken links to it; I guess it's been around since 2008, and some of the sites and download locations for it have died off.

    I saw this and it caught my attention, so I am going to try out wget myself.
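    For anyone trying the same thing, this is roughly the kind of run I plan to test, as a sketch only: it assumes the GNU wget binary is installed and on the PATH, and the index URL is a placeholder.

    Code:
    # Sketch: recursive wget run limited to PDFs, driven from Python.
    # Assumes the GNU wget binary is on PATH; the URL is a placeholder index page.
    import subprocess

    index_url = "https://example.com/tutorials/"  # placeholder

    subprocess.run(
        ["wget",
         "--recursive",                         # follow links from the index page
         "--level=2",                           # don't go more than 2 links deep
         "--no-parent",                         # stay below the index URL
         "--accept", "pdf",                     # keep only .pdf files
         "--directory-prefix", "offline_pdfs",  # where to save the files
         index_url],
        check=True,
    )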

    I have used two other methods for offline web content: Httrack and WebThumbnailer.

    Httrack basically creates an exact copy of a website or web page. Features that depend on the site's database won't work in the copy, but hyperlinks between the pages you copied do work offline. The problem with Httrack is that if the site you want a copy of has downloads on it, it will grab all of those downloads too, so if the downloads on the site amount to 15GB, it will pull down 15GB unless you edit the filters and settings that control what gets captured for the offline copy.
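    If you do want to rein it in, the command-line version of Httrack takes +/- filter patterns after the URL; a rough sketch, assuming the httrack binary is installed, with a placeholder URL and placeholder patterns:

    Code:
    # Sketch: Httrack mirror restricted by filters so it skips big downloadable files.
    # Assumes the httrack command-line binary is installed; URL and patterns are placeholders.
    import subprocess

    subprocess.run(
        ["httrack", "https://example.com/tutorials/",
         "-O", "mirror_out",                 # output directory for the offline copy
         "+*.html", "+*.css", "+*.jpg",      # keep the pages themselves
         "-*.zip", "-*.iso", "-*.exe"],      # skip large downloads
        check=True,
    )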

    WebThumbnailer can be given a list of URLs and takes a picture of each web page, saved as a JPG or similar. You can then open the images, zoom to get the sizing right, and read the content of the page; the image is usually very long but small in file size, like 54k.
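    WebThumbnailer itself is a point-and-click tool, but the same idea (URL list in, long page image out) can be roughly approximated with headless Chromium's --screenshot option. A sketch with placeholder URLs, assuming Chromium is installed; it writes PNGs rather than JPGs:

    Code:
    # Sketch: turn a list of URLs into tall page images using headless Chromium
    # as a rough stand-in for WebThumbnailer. URLs and binary name are placeholders.
    import subprocess
    from pathlib import Path

    urls = ["https://example.com/a", "https://example.com/b"]  # placeholders
    out_dir = Path("page_images")
    out_dir.mkdir(exist_ok=True)

    for i, url in enumerate(urls, start=1):
        img_path = out_dir / f"page_{i:03d}.png"
        subprocess.run(
            ["chromium", "--headless", "--disable-gpu",
             "--window-size=1280,10000",      # tall window to capture long pages
             f"--screenshot={img_path}", url],
            check=True,
        )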

    If a website has a bunch of PDFs for free download, Httrack can be used to grab every PDF the site has. I used it to get a complete copy of every custom map submitted for Unreal Tournament 99, so I'd have a complete collection of UT99 community-created maps and thousands of different rooms to play the game in beyond the ones that come with the game. I basically made an exact copy of the website, which took 3 days, and then ran an instruction to copy all *.zip files to a separate location on my computer, which grabbed all the map files that were available as zipped downloads. Then I deleted the gigs of webpage, blog, and chat content that wasn't needed, getting back 18GB of hard drive space. All the maps combined came to 14.9GB of zip files gathered from that website, all free to the public.
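    That "copy all *.zip files to a separate location" step is easy to script; a small sketch using only the Python standard library, with placeholder folder names:

    Code:
    # Sketch: pull every .zip out of an Httrack mirror into one flat folder.
    # The mirror and destination paths are placeholders.
    import shutil
    from pathlib import Path

    mirror_dir = Path("mirror_out")   # where Httrack wrote the site copy
    maps_dir = Path("ut99_maps")      # flat folder for the zipped maps
    maps_dir.mkdir(exist_ok=True)

    for zip_file in mirror_dir.rglob("*.zip"):
        shutil.copy2(zip_file, maps_dir / zip_file.name)
        print(f"Copied {zip_file}")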