pfSense + HAProxy + OpenVPN + DNS on Cloudflare

Hi everybody, I’m trying to set up HAProxy as Tom explained in this video so I can access my servers from inside my LAN and from LTE via OpenVPN, without exposing any of them on the internet, specifically by setting up an A record on Cloudflare, but it doesn’t work for me. I keep getting a browser error saying “Safari can’t connect to http://… because it couldn’t find the server.” When I can connect to the server, the browser still flags it as not secure, as if it’s ignoring the certificate set up in HAProxy via the ACME package.

Note… my local domain is and many machines are set to . Does this conflict with the fact that my FQDN is also ?

Can anybody help me?

If it is not sending the matching cert then what certificate is it sending?

The self-signed cert, so I get the “not secure” warning. With Safari, for example, I even get the self-signed certificate from TrueNAS shown as a valid certificate.
I’m trying to make it work over OpenVPN as well, as you did in the latest example, storing an A record with the DNS “provider,” but to make it work I also need to add a DNS host override in the pfSense DNS Resolver. With some servers it doesn’t work at all… for example my FreePBX server. A note… the internal domain set up on each machine itself is also the same as the FQDN I registered… can this be an issue? I’m using Cloudflare to set up my DNS… is there something I need to know?

From my iPhone I can’t connect to my pfSense box anymore using the hostname + port…

The FQDN has to match the certificate that is being sent to the device.
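One quick way to check which certificate HAProxy is actually serving for a given name is to connect with SNI, the way a browser does. A minimal Python sketch (the hostname in the usage comment is a placeholder, not taken from this thread):

```python
import socket
import ssl

def cert_names(cert: dict) -> list:
    """Collect the commonName and DNS subjectAltName entries from a cert
    dict in the shape returned by ssl.SSLSocket.getpeercert()."""
    names = [v for rdn in cert.get("subject", ()) for k, v in rdn if k == "commonName"]
    names += [v for k, v in cert.get("subjectAltName", ()) if k == "DNS"]
    return names

def served_cert(host: str, port: int = 443) -> dict:
    """Connect with SNI (like a browser would) and return the certificate
    the server actually sends. Verification stays on, so a self-signed
    cert raises ssl.SSLCertVerificationError -- which is itself the answer."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

# Hypothetical usage:
# print(cert_names(served_cert("truenas.example.com")))
```

If this prints the Let’s Encrypt wildcard names, HAProxy is serving the right cert and the problem is elsewhere; if it raises a verification error or shows a different subject, the frontend is serving the wrong certificate.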

I set up ACME + Let’s Encrypt to use a wildcard *, and my TrueNAS server is set to in the TrueNAS network settings as well. It receives its IP via a reserved DHCP lease from pfSense, and in the DHCP settings it’s also set up with truenas as host and as domain. I also set up HAProxy as you explained, and I added an A record to my Cloudflare account pointing to my pfSense LAN IP. If I open Safari and try , I get a valid certificate, but it’s not the R3 wildcard from Let’s Encrypt; it’s a localhost certificate… I don’t understand where I’m going wrong. I also don’t understand why I need to add a host override in the DNS Resolver if I already added the A record in Cloudflare, as you explained in the DigitalOcean example… I hope I explained myself better; English is not my first language…

If DNS is correct and you are getting the wrong cert from HAProxy, then there is a setting missing in HAProxy around which cert it should be serving.

The same frontend and certificate are used with other servers, and on those other servers I get the R3 Let’s Encrypt wildcard certificate… :man_shrugging:t2:
Why do I need to set both the host override in pfSense and the A record on Cloudflare? I wanted to set it up only on Cloudflare, but if I do only that it doesn’t work at all…

If it is getting the correct DNS record from Cloudflare then you do not need the host override, and the problem is also not DNS but something in HAProxy.
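For comparison, a working setup boils down to a frontend roughly like the sketch below; all names, IPs, and paths here are placeholders, not taken from this thread. In the pfSense HAProxy package this maps to the frontend’s “SSL Offloading” checkbox, the certificate selected under it, and the ACL/backend rules:

```haproxy
# Hypothetical sketch -- adjust names, IP, and cert path to your environment.
frontend https_in
    bind 192.168.1.1:443 ssl crt /var/etc/haproxy/wildcard.example.com.pem
    mode http
    # Route by the requested hostname after TLS is terminated:
    acl host_truenas hdr(host) -i truenas.example.com
    use_backend be_truenas if host_truenas

backend be_truenas
    server truenas 192.168.1.50:443 ssl verify none
```

If the wildcard cert is not attached to the `bind` line (in the GUI: not selected for SSL offloading on that frontend), HAProxy falls back to whatever default certificate it has, which would explain the localhost/self-signed cert being served.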

I just went through this process, except I am using OPNsense instead of pfSense.

It might. I was doing the same until recently, but then I changed my internal domain to be a sub-domain of my hostname. Make sure you don’t use the same sub-domain anywhere else. E.g., you can use the internal domains to be

If you are not exposing the services to the internet – which is what it seems like – then you have no need for any A records in the public DNS. All you need is a Cloudflare account handling your DNS, as was explained. You will need the Host Overrides in Unbound to point to the pfSense LAN IP so that HAProxy can handle the requests.
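Under the hood, a Host Override in the pfSense DNS Resolver just becomes an Unbound local-data entry, roughly like this sketch (hostname and IP are placeholders):

```
server:
  # Answer truenas.example.com locally with the pfSense LAN IP,
  # so the query never leaves the LAN and HAProxy on pfSense
  # terminates the connection:
  local-data: "truenas.example.com. IN A 192.168.1.1"
```

That is why the override is needed when there is no public A record: without it, LAN clients have nowhere to resolve the name to.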

See this post.

I switched over from Caddy 2 to HAProxy as the reverse proxy in that process as well, and I can confirm that I removed all my A records from my Cloudflare account and am now using Let’s Encrypt to serve certificates to all my services through HAProxy. After all that was working, I added one A record in my Cloudflare account so that I could use it in conjunction with DDNS for OpenVPN access and won’t have to change the clients each time my WAN IP changes.