So you have a new site, you want to index it, cache it, and make all the magic happen so that it's a little faster, right? Good deal. Now how do we make this happen?
We are going to assume a few items here.
1. You already have a website; in this example it is WordPress, since caching is going to be used.
2. You have installed a caching system and configured it. Redis, LS Cache, WP Cache, something along those lines. (A quick way to confirm it is actually responding is shown below.)
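Before warming anything, it helps to confirm the cache layer is answering at all. A minimal sketch: the exact header name depends on your plugin (X-LiteSpeed-Cache, X-Cache, and similar are all possibilities), so treat the grep pattern here as an assumption and adjust it to whatever your setup actually emits. Replace example.com with your own site.
curl -sI https://example.com/ | grep -i cache
Run it twice; on many setups the second response flips from a miss to a hit, which tells you the cache is doing its job before you bother warming it.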
Now how do we cache your site for better performance, besides clicking on every single page and link you have? Pretty easy! We use wget.
wget -r -l 3 -nd --delete-after <URL>
Where:
-r (or --recursive) will cause wget to recursively download files
-l N (or --level=N) will limit recursion to at most N levels below the root document (defaults to 5; use inf for infinite recursion)
-nd (or --no-directories) will prevent wget from creating local directories to match the server-side paths
--delete-after will cause wget to delete each file as soon as it is downloaded (so the command leaves no traces behind)
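If you are worried about hammering your own server while the cache fills, wget can pace itself. This is just a gentler variant of the same command, not a requirement; the one-second wait and the rate cap are arbitrary values I am assuming here, so tune them to your server.
wget -r -l 3 -nd --delete-after --wait=1 --limit-rate=500k <URL>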
So let's run it against your site. I will use my video site as a demo.
wget -r -l 3 -nd --delete-after https://videos.theserveradmin.com
Well, it's a good thing I have a super fast internet connection, because I did not think this one through: since it's a video site, I just downloaded all 10 videos.
Yeah, so watch what your site serves before you run this. Once you run it against your main site, though, it will trigger cache hits on every page it touches and off you go!
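If your site is full of large media like mine, you can tell wget to skip those files and only pull the pages that actually need cache priming. The extension list below is just my guess at what a video site serves, so adjust it to match your own uploads.
wget -r -l 3 -nd --delete-after --reject "mp4,webm,mov" https://videos.theserveradmin.com
The HTML pages are still requested, which is what actually warms the page cache, but the big video files never come down the pipe.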