OT: Need recommendation for website download software

I need something to download the entire contents of a website so that I can then have a duplicate on my computer.

I have legitimate access and can print out any number of pages but, for example, when I need to dismantle 'Part C' and the page says you first have to 'remove Part B - click here for instructions', the link works on the live site, but you obviously can't click a link on a printed page and expect to be taken to the relevant part :/

Reply to
Pete Zahut

If you have "legitimate access", is the main site based on WordPress?

If so, there are a number of backup plug-ins that allow you to back up a site, download it and transfer it to a local site for development.

I have used one (UpdraftPlus) to help sort out a friend's website with great success. (It's free for small websites.)

Reply to
Chris B

Apologies if this appears twice - I posted about 20 minutes ago and can't see it.

I have a need to download the entire contents of a website and to keep its directory structure in place so that I have an exact duplicate on my computer.

I have legitimate access and can download and print unlimited pages but my problem is that the hypertext links obviously don't work on the printed page.

As an example, I may need to replace 'Part C' but in order to do so I have to remove 'Part A' and 'Part B' first. On the live website it says "In order to replace Part C you need to first remove Part A (click here for instructions) and Part B (click here for instructions)."

I need a copy of the website in its original structure so that those "Click here for instructions" links actually work.

Reply to
Pete Zahut

Look at some free web editors. I believe KompoZer will do what you want.

Tim

--

Reply to
Tim+

Sadly no, it's not based on WordPress, but thanks for your reply anyway, Chris.

Reply to
Pete Zahut

Tim+ explained:

Thanks Tim, I'll have a look.

Reply to
Pete Zahut

I'm confused. First you say you want to download the website to your PC, then you start talking about links on a printed copy.

I downloaded a website to my PC a few years ago and all the links still work.

I can't remember what I used but I'm sure Google can help you if you don't get an answer here.

The site owner made his work freely available because of a problem he was having with his ISP. As well as keeping the copy on my PC, I made interested members of another forum aware of the imminent disappearance of a very useful resource, and the forum owner downloaded it and made it available to anyone on the web.

You can find it here:

formatting link
and you will see that all the links work, just the same as they do on the copy on my PC.

Reply to
Terry Casey

google "wget"

Reply to
Tim Watts


wget is the ultimate tool for this, and has been so for many years. Manual online, and available for UNIX and Windows.

formatting link
The FAQ will get you the Windows binaries, and Linux/FreeBSD will have it as a package. For example, on FreeBSD it's just 'pkg install wget'.

Reply to
Bob Eager

This very much depends on how the website is organised. If it relies on scripts like PHP or a database running on the server then all bets are off unless you have admin access to the server and are also able to run the same type of webserver software on your own computer.

If the website is served entirely from static HTML pages then you should be able to use wget to recursively download it into a matching directory structure on your PC.

wget is commonly found on most Linux and Unix systems but there are also versions available for Windows and Apple OS X. For more info see

formatting link
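To give a flavour of it, a typical mirroring invocation might look like the sketch below. The URL and output directory are placeholders, not the OP's actual site, and the command is assembled and echoed rather than run, since actually fetching needs network access:

```shell
#!/bin/sh
# Sketch of a wget mirroring command; URL and output directory are placeholders.
URL="https://example.com/manual/"

# --mirror            recursive download with timestamping
# --convert-links     rewrite links so they work in the local copy
# --page-requisites   also fetch the images/CSS each page needs to display
# --no-parent         don't wander above the starting directory
WGET_CMD="wget --mirror --convert-links --page-requisites --no-parent --directory-prefix=manual-copy $URL"
echo "$WGET_CMD"
```

The --convert-links option is the important one for the OP's problem: it rewrites the "click here" links so they point at the local copies instead of the live server.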

Reply to
Mike Clarke

wget should do what you want. It's available on most platforms. Typically command line driven:

formatting link
Windows version:

formatting link

Reply to
John Rumm

Unless the site depends on PHP scripts or a database on the server.

Reply to
Mike Clarke

Try HTTrack

formatting link
- creates a complete copy of the website on your hard drive. And it's free!

Adrian

Reply to
Adrian Brentnall

Quite simply, it's a website that holds a workshop manual for an item of machinery. I've paid a sum of money for unlimited access to the manual, which includes the right to print off any and all pages that I want, at any time. So, given the time and inclination I could print off all one thousand pages and have a hard copy here.

However, as I said above, when I come to a situation where half a dozen or more other items need to be removed before I can get to the part I need to replace, the printed pages would lack the dynamic aspect of a hypertext-linked website, so it would take an awful lot of shuffling of pages and extra time to get to the relevant information.

I would like a full HTML copy of the website on my laptop to refer to dynamically, in situations where I will not have any internet access and so cannot refer to the live website.

Reply to
Pete Zahut

I've not used Wget, but I have used HTTrack (for Windows) quite successfully.

formatting link
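Besides the Windows GUI, HTTrack also has a command-line form. A sketch, with a placeholder URL and output directory (again only echoed here rather than run, since a real mirror needs network access):

```shell
#!/bin/sh
# Sketch of an HTTrack invocation; URL and output directory are placeholders.
URL="https://example.com/manual/"

# -O sets the output directory; HTTrack mirrors recursively and
# rewrites links for local browsing by default.
HTTRACK_CMD="httrack $URL -O ./manual-copy"
echo "$HTTRACK_CMD"
```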

Reply to
GB

I use Core FTP LE for uploading to the sites I manage. But (provided you can connect to a site) dragging and dropping it all back to a local drive gives you a "local" copy with links that work fine.

Reply to
newshound

Well, I think many people assume you built the site offline and tested it, so you already have a copy. I also thought many companies that offer hosting provide such tools.

Brian

Reply to
Brian Gaff

Does this not assume you have FTP login credentials for the web server hosting the site though?

Reply to
John Rumm

It does.

Reply to
JoeJoe
