How we use domain and social media monitoring to provide rich data about companies
As a credit reference agency, we believe rich, trusted data is essential for making the right decisions. The same data can also be used to detect anomalies and to increase sales by powering better prospecting tools.
We continuously monitor the domains belonging to a business: their subdomains, issued certificates, DNS records, and the actual content of their pages, including which payment methods and other third-party services they use.
We operate a large, distributed network of crawlers that monitors both zone files for selected domains and certificate transparency logs. Certificate transparency logs contain an entry for every publicly issued certificate, which lets us discover many of a company's subdomains and other important signals. The certificates can also reveal which third-party services a company uses, which feeds both anomaly analysis and prospecting.
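To illustrate how certificate transparency entries surface subdomains, here is a minimal sketch in Python that queries the public crt.sh search service; the crt.sh endpoint, the example.com domain and the filtering logic are illustrative assumptions and not a description of our production crawlers.

```python
import requests

def discover_subdomains(domain: str) -> set[str]:
    """Query the public crt.sh certificate transparency search for a domain
    and collect the host names seen in issued certificates."""
    resp = requests.get(
        "https://crt.sh/",
        params={"q": f"%.{domain}", "output": "json"},
        timeout=30,
    )
    resp.raise_for_status()
    hosts: set[str] = set()
    for entry in resp.json():
        # name_value may contain several names separated by newlines
        for name in entry.get("name_value", "").splitlines():
            name = name.strip().lstrip("*.").lower()
            if name.endswith(domain):
                hosts.add(name)
    return hosts

if __name__ == "__main__":
    # example.com is a placeholder; substitute the monitored company's domain
    for host in sorted(discover_subdomains("example.com")):
        print(host)
```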
We monitor DNS records daily and make it simple to consume anomalies in how they change, for example when a business tries to hide its destination IP address by proxying traffic through an intermediary.
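A minimal sketch of the daily diffing idea, assuming dnspython and a stored snapshot of yesterday's answers; the record types and snapshot format are illustrative, not our actual pipeline.

```python
import dns.resolver

def resolve(name: str, rtype: str) -> set[str]:
    """Return the current answers for a record, or an empty set if none exist."""
    try:
        return {r.to_text() for r in dns.resolver.resolve(name, rtype)}
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return set()

def diff_records(name: str, previous: dict[str, set[str]]) -> dict[str, dict[str, set[str]]]:
    """Compare today's answers with yesterday's snapshot and report
    added and removed values per record type."""
    changes: dict[str, dict[str, set[str]]] = {}
    for rtype in ("A", "AAAA", "CNAME", "MX"):
        current = resolve(name, rtype)
        before = previous.get(rtype, set())
        if current != before:
            changes[rtype] = {"added": current - before, "removed": before - current}
    return changes

# Example: an A record that suddenly points at a generic proxy range instead of
# the company's usual hosting provider shows up as added/removed values here.
```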
In comparison to traditional web crawlers, which visit pages periodically and follow links in the static HTML, we use a distributed network of headless browsers that simulate real users. This lets us crawl HTML generated by front-end frameworks such as React, Vue, Svelte, Angular and many more, and to automatically detect front-end APIs and the underlying tech stack. We also detect usage of Stripe tokens and other payment gateways. The approach is similar to the Wayback Machine, an archive of pages found on the internet, but our method is tailored for anomaly analysis.
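The sketch below shows the idea with Playwright in Python, rendering a page headlessly and checking a few illustrative DOM markers for frameworks and for Stripe; the marker list and the example URL are assumptions for demonstration, not our real detection rules.

```python
from playwright.sync_api import sync_playwright

# Simple heuristics: a marker string in the rendered HTML that hints at a
# framework or payment gateway. Real detection is considerably more involved.
MARKERS = {
    "React": ["data-reactroot", "__NEXT_DATA__"],
    "Vue": ["data-v-app", "__NUXT__"],
    "Angular": ["ng-version"],
    "Stripe": ["js.stripe.com"],
}

def detect_stack(url: str) -> list[str]:
    """Render a page in a headless browser and report which markers appear
    in the final, JavaScript-generated HTML."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        html = page.content()  # HTML *after* client-side rendering
        browser.close()
    return [name for name, needles in MARKERS.items()
            if any(needle in html for needle in needles)]

if __name__ == "__main__":
    # https://example.com is a placeholder for a monitored company site
    print(detect_stack("https://example.com"))
```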
We continuously search for social media accounts and pages linked to companies on Facebook, LinkedIn, X, Instagram and many more. We look, for example, at the number of posts, the date of the most recent post, and reviews, and bundle these signals into anomaly metrics for each business.
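As a rough illustration of how such signals could be bundled, the sketch below turns post counts, last-post dates and review scores into coarse anomaly flags; the field names and thresholds are made up for the example and do not reflect our production metrics.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SocialProfile:
    platform: str               # e.g. "LinkedIn", "Facebook", "X", "Instagram"
    post_count: int
    last_post: date | None
    review_score: float | None  # average rating where the platform has reviews

def anomaly_flags(profiles: list[SocialProfile], today: date) -> dict[str, bool]:
    """Bundle raw social signals into coarse anomaly flags for a business."""
    last_posts = [p.last_post for p in profiles if p.last_post]
    return {
        "no_social_presence": not profiles,
        "dormant_accounts": bool(last_posts)
            and all((today - d).days > 180 for d in last_posts),
        "low_review_scores": any(
            p.review_score is not None and p.review_score < 2.5 for p in profiles
        ),
    }
```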