The first statistic is the simplest: the number of countries that are hosting OGC services. A country, for our purposes, is simply defined as having a unique ccTLD (the last part of a domain: .pl, .us, .br, .au, etc.). At the time of writing this blog post, 87 of the 244 defined ccTLDs host services. (Note this does include .eu for the European Union, which most people wouldn't actually consider a country.)
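To illustrate the idea, here's a minimal sketch of how a ccTLD might be pulled out of a service URL. This is our own naive example, not GeoSeer's actual code: a real implementation would check the result against the official ccTLD list, since generic TLDs like .com or .org aren't countries.

```python
from urllib.parse import urlparse

def cctld(url: str) -> str:
    """Return the last label of a URL's hostname (a naive ccTLD guess).

    Hypothetical helper: generic TLDs such as .com would need to be
    filtered out against the official ccTLD list in practice.
    """
    host = urlparse(url).hostname or ""
    return host.rsplit(".", 1)[-1]

print(cctld("https://ogc.example.pl/wms"))  # -> "pl"
```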
GeoSeer has been live for almost 18 months now, and we've been crawling the WWW for OGC services for even longer. This means we have a trove of historical data about services, and the new stats expose some of that. If you look at the stats page now, you'll see the General Stats section has been tweaked slightly.
As well as continuing to show stats about the current state of OGC services "Now", we've added an extra column for "Ever" which shows the total numbers that we've ever found since we started doing this. Then with a little maths we show the percentage of the things we've ever found that are still alive now.
The Ephemeral Nature of Public Data
The single most glaring statistic from this historical data is that we've found a total of 4,949,124 datasets since we started crawling, but only 1,865,660 are live and active in our index right now. Or put another way, just 37.7% of the datasets hosted by OGC services that were publicly available at some point in the past 18 months are still online!
And while that's the most stand-out statistic, the others also show how transient the OGC services that host these datasets are. Over the course of the past ~18 months we've found 291,779 different services, yet only 71.83% of them were online and responding on our last crawl.
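The survival-rate arithmetic behind those percentages is straightforward; a quick sketch using the dataset numbers quoted above:

```python
# Survival rate: share of everything ever found that is still live now.
datasets_ever = 4_949_124
datasets_now = 1_865_660

survival_pct = datasets_now / datasets_ever * 100
print(f"{survival_pct:.1f}% of datasets are still online")  # -> 37.7%
```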
The final statistic of note here is the number of hosts. These are the domain names themselves, and different subdomains are counted as different hosts (so www.example.com is different from ogc.example.com). Even these have experienced considerable churn over what is a relatively short period of time, with only 85.5% of hosts remaining online. We should point out that when deciding whether something counts as a "host", we ignore the scheme (that's the https://) and the port, so if a host changes from insecure to secure (and quite a few do), it won't make a difference to this statistic.
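That host rule can be sketched as a small normalization step. Again this is an illustrative example of the rule as described, not GeoSeer's implementation:

```python
from urllib.parse import urlparse

def host_key(url: str) -> str:
    """Normalize a service URL to its bare hostname for counting,
    ignoring both the scheme and the port (hypothetical helper
    mirroring the rule described in the post)."""
    return urlparse(url).hostname or ""

# An http -> https migration doesn't change the host count:
print(host_key("http://ogc.example.com:8080/wms"))  # -> "ogc.example.com"
print(host_key("https://ogc.example.com/wms"))      # -> "ogc.example.com"
```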
All of this change makes it harder for users to rely on this data even if they can find it. That's especially true for scientific research, which relies on repeatability, including the ability for other scientists to go back and take a second look at the original data: a difficult thing to do when the datasets, services, or hosts have gone offline.
This also highlights the importance of keeping data portals current. Link rot is a real thing, and data curators need to maintain their portals; otherwise the portals are worse than useless, because they're wasting everyone's time with bad links.
The other part of this statistics update is a collection of extent map plots that show what parts of the world have datasets. We're going to do a separate blog post about them in the future.