Our site currently uses a privately developed DDoS protection page for our end users. It requires suspected bots to complete CAPTCHAs and similar challenges, and it is served to every client because we have had issues with bots in the past.
We do, of course, allow known web crawler user agents to bypass this protection page, but Surreal doesn't seem to use a unique User-Agent. That prevents us from letting Surreal clients pass through the protection page to edit the actual HTML. Is a distinct User-Agent in the plans?
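For context, here is a minimal sketch of the kind of server-side allowlist check involved; the prefix list and function names are illustrative, not our actual implementation. A bypass like this only works when the client sends a distinctive User-Agent string to match on:

```python
# Hypothetical allowlist of User-Agent prefixes permitted to skip
# the DDoS protection page (illustrative values only).
KNOWN_CRAWLER_PREFIXES = (
    "Googlebot",
    "Bingbot",
    "DuckDuckBot",
)

def bypasses_protection(user_agent: str) -> bool:
    """Return True if this client may skip the protection page."""
    return user_agent.startswith(KNOWN_CRAWLER_PREFIXES)

# A known crawler UA matches; a generic UA (as Surreal sends today)
# has nothing distinctive to match and gets the challenge page.
print(bypasses_protection("Googlebot/2.1 (+http://www.google.com/bot.html)"))
print(bypasses_protection("Mozilla/5.0"))
```

With a unique User-Agent from Surreal, we could simply add its prefix to this list.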