Can't be done, sorry... I think. The nearest you'll get is saving the site to your reading list, but once the site is CLOSED it can't be accessed and its links won't work. Imagine the file-structure complexity of a website: in basic terms it's a directory with lots of paths and pages that you navigate around. Storing all of that offline is impossible to my knowledge. You can, however, store individual web PAGES for offline reading. Links won't work if the site closes, but if you have the pages stored offline in, say, your reading list or OneNote, then you can still read them even after the site disappears. Then again, imagine the amount of work it would take to store every PAGE of a site individually for offline reading.
EDIT>> This might do it?
http://www.howtogeek.com/171948/how-...tire-web-site/
First comment on that link reads: "How Can I Download an Entire Web Site?" The short answer... these days, that is impossible.
You can download the HTML, CSS, JS, images and any media files... but that is NOT the entire website.
It would not be possible to download every PHP, XML, etc. server-side file, because most systems don't tell you which files are needed. If you had direct links to each PHP file you could probably download them, except for the ones with die commands that stop them from being accessed directly.
Databases are also impossible to download without permission.
So with HTTrack or Wget or any other kind of downloader... what the question asks for is not possible.
If the word "Entire" weren't in the question, it would be a totally different conversation.
(Not sure whether the tools it covers are compatible with current Windows versions, though. Good luck!)
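EDIT 2>> If a static snapshot of the public pages is all you need, Wget (mentioned in that comment) can mirror them. This is just a rough sketch, assuming wget is installed and with example.com standing in for the real site:

# Mirror the public, static side of a site for offline reading.
# --mirror            recurse through the site and keep timestamps
# --convert-links     rewrite links so they work offline
# --page-requisites   grab the CSS/JS/images each page needs
# --adjust-extension  save pages with .html extensions
# --no-parent         don't wander above the starting directory
wget --mirror --convert-links --page-requisites --adjust-extension --no-parent http://example.com/

Even then you only get the rendered output; the PHP source and the database behind the site stay on the server, which is exactly the point that comment makes.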