
How ConsentManager handles Search Engine Crawlers

In order to give search engines an accurate picture of the website, we do not block the consent layer from search engine spiders. This allows the search engine to verify that the website uses measures to prevent data processing and the setting of cookies without visitor consent.

However, to avoid counting search engine robots in our reports, we have set up a robots.txt file that forbids search engines from accessing our counting and consent-measurement pixels:

User-agent: *
Disallow: /delivery/consent.php
Disallow: /delivery/pixel.php
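
The effect of these rules can be checked with a short script. Below is a minimal sketch in Python using the standard urllib.robotparser module; the host name "delivery.consentmanager.net" is only an illustrative assumption, so substitute the actual delivery domain when testing.

    # Minimal sketch: confirm that the robots.txt rules above block crawlers
    # from the counting and consent pixels while leaving other paths open.
    # The host "delivery.consentmanager.net" is a placeholder assumption.
    import urllib.robotparser

    ROBOTS_TXT = """\
    User-agent: *
    Disallow: /delivery/consent.php
    Disallow: /delivery/pixel.php
    """

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    for path in ("/delivery/consent.php", "/delivery/pixel.php", "/"):
        url = "https://delivery.consentmanager.net" + path
        allowed = parser.can_fetch("Googlebot", url)
        print(f"{path}: {'allowed' if allowed else 'disallowed'}")

    # Expected result: both pixel endpoints are disallowed, "/" stays allowed,
    # so crawlers see the consent layer but are not counted in reports.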
