Hi guys, new joiner to the forums and a regular watcher of Tom on YouTube.
If this post is not for this section please move.
I am looking for a way to do Web/Internet caching like a CDN and was wondering what solutions are available for this.
A long time ago I used Microsoft ISA to cache all my web traffic, Microsoft updates, etc., and I am looking for something similar.
I know that time doesn’t stand still in IT so I want to look at options.
I want to host this on my internal network so the use of an external CDN provider is not what I am looking to do.
Happy to have a look at any recommendations.
Are you looking to cache a website/internet application you run for performance and scalability (what you would typically use a CDN provider for, like Cloudflare), or are you looking to cache internet access on your local network trying to save bandwidth and increase responsiveness?
Squid, Apache, and Nginx can all do this in both directions; set them up as “reverse proxies” to act as a caching/load-balancing/security layer in front of web servers you operate.
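For the reverse-proxy direction, here is a minimal Nginx caching sketch. The hostname, backend address, cache path, and sizes are all placeholders you would adjust for your own setup:

```
# Hypothetical nginx.conf fragment: cache responses from a backend web server.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=webcache:10m
                 max_size=1g inactive=60m;

server {
    listen 80;
    server_name app.example.internal;        # placeholder hostname

    location / {
        proxy_pass http://127.0.0.1:8080;    # your backend web server
        proxy_cache webcache;
        proxy_cache_valid 200 302 10m;       # cache successful responses for 10 min
        proxy_cache_valid 404 1m;
        add_header X-Cache-Status $upstream_cache_status;  # shows HIT/MISS for debugging
    }
}
```

The `X-Cache-Status` header is handy while testing: request the same URL twice and the second response should report a cache hit.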
As “forward” proxies, in the modern era, you don’t gain much by caching anymore (we used to deploy Squid proxies everywhere in the 1990s and early 2000s to ‘accelerate’ internet connections), because the vast majority of websites are now dynamically generated at access time and use SSL. You generally can’t cache SSL connections without breaking SSL and becoming a man in the middle (plus the cached data might be sensitive and shouldn’t be stored). Proxy servers can still relay SSL traffic, though.
Proxies can be good security tools though. I have a few networks for PLCs and other control devices that have no direct routing in or out of the network, but do have a proxy server there for things like downloading updates, looking up manuals while servicing, backing up programming, and other misc needs for internet access on that network.

Proxy servers are also good for logging access, and the logs are a bit easier to interpret than a packet firewall log - if you require authentication on your proxy, you get confirmation of who is accessing what. They’re good for filtering too, a little easier than setting up packet rules.

Also, using a proxy for FTP gives a more browser-friendly experience, and if the proxy’s upstream port is in a DMZ (or directly connected to the internet), you can bypass all the FTP-over-NAT issues.
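A locked-down control network like that might look something like this in squid.conf. The subnet, auth file path, and vendor domain are illustrative, not from any real deployment:

```
# Hypothetical squid.conf fragment for a control network with no default route out.
http_port 3128

# Require basic auth so the access log records who connected
# (the htpasswd file path is an example)
auth_param basic program /usr/lib/squid/basic_ncsa_auth /etc/squid/passwd
auth_param basic realm Control network proxy
acl authed proxy_auth REQUIRED

# Only the PLC subnet may use the proxy, and only to named vendor sites
acl plc_net src 10.20.30.0/24
acl vendor_sites dstdomain .example-vendor.com
http_access allow plc_net authed vendor_sites
http_access deny all
```

The final `deny all` is the important line: anything not explicitly allowed is blocked, and every allowed request lands in the access log with a username attached.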
Thanks for the detailed response @egftechman .
My thoughts were for forward caching, and I hadn’t considered that SSL being everywhere now causes challenges for caching.
Yes, I am looking to increase responsiveness on my local network. I think I’ll look at setting up a proxy of some kind.
Again, you won’t notice a significant improvement in responsiveness, as hardly anything will be cacheable between SSL and dynamically generated sites. Like I said, you have to look at proxy servers as security applications now, not as the “accelerators” we used them for in the 1990s. Optimizing your DNS can add a bit of “snappiness” to your internet experience: run a local DNS cache that uses the fastest upstream DNS servers for your location.
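A local caching resolver is a quick win. Here’s a minimal Unbound sketch; the LAN subnet and upstream resolver addresses are examples, so measure which upstreams are actually fastest from your location:

```
# Hypothetical unbound.conf: local caching resolver forwarding to fast upstreams.
server:
    interface: 0.0.0.0
    access-control: 192.168.1.0/24 allow   # your LAN subnet (example)
    cache-min-ttl: 60                      # keep answers at least 60s
    prefetch: yes                          # refresh popular records before expiry

forward-zone:
    name: "."
    forward-addr: 1.1.1.1                  # Cloudflare (example upstream)
    forward-addr: 9.9.9.9                  # Quad9 (example upstream)
```

Point your DHCP server at this box as the DNS server and every client on the LAN benefits from the shared cache.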
If you really want to try a caching proxy server, Squid is optimized for that and still actively maintained…
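If you do go that route, the main knobs in squid.conf are the disk cache, memory cache, and refresh rules. Sizes and the update-file pattern below are just illustrative starting points:

```
# Hypothetical squid.conf caching tune-up (sizes are examples).
cache_dir ufs /var/spool/squid 10240 16 256   # 10 GB disk cache
cache_mem 512 MB                              # hot objects kept in RAM
maximum_object_size 512 MB                    # allow large update downloads
# Cache installer/update-style files aggressively (pattern is illustrative)
refresh_pattern -i \.(cab|exe|msi|msu)$ 4320 80% 43200
```

With SSL everywhere, only plain-HTTP objects will actually land in the cache, so temper expectations accordingly.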
Apache and NGINX can also be configured as caching proxy servers.