Of course it's all part of Google's good citizen activities (as opposed to its bad citizen tax-avoiding activities), but credit where credit's due. The idea is that those who organise the world's online information (Google, mostly) can both get ready, in advance, the disparate sources of information that may be sought before, during and after an emergency or disaster, and prepare important parts of the infrastructure to cope with the sudden and unexpected peaks in demand that tend to occur when something nasty happens.
Google points out that a large and increasing proportion of the world's population turns to the Internet for information, so it has developed tools to have the right, dynamic content on hand, and has been doing so since Hurricane Katrina in 2005.
It says in its submission to the committee that, to be easily integrated and disseminated in the event of a crisis, emergency information must be readily available — in open formats, open licensing structures, and already online — in advance of a disaster, not as it unfolds. Otherwise there can be delays in getting information out. Each extra step — uploading, emailing, downloading, publishing, or putting on a site — can keep critical information from getting to people in a timely manner.
Essentially, Google has been able to promote items about storm paths, shelter locations, and evacuation orders to the top of relevant search results. When Hurricane Sandy came along it took things even further: Google Search and Google Maps were enlisted to share emergency weather updates, maps of the storm path, and so on.
Google has also developed a tool called Person Finder, initially for the Haiti crisis, which pulls together all the missing/found person data that tends to grow on disparate systems as a crisis unfolds. Person Finder is now available in over 40 languages, has its own API and can act as a central clearing house for all the missing and found person data.
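To give a flavour of how that API can be used, here is a minimal sketch of fetching a single missing-person record in Python. The repository name, API key and record ID are placeholders, and the endpoint path and PFIF namespace are assumptions based on Person Finder's published read API, so treat this as illustrative rather than definitive.

```python
# Sketch: query the Person Finder read API for one PFIF person record.
# Assumptions: the /api/read endpoint with "key" and "id" parameters, and
# the PFIF 1.4 namespace; the repository, key and record ID are placeholders.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

BASE = "https://www.google.org/personfinder/haiti/api/read"  # placeholder repository
PFIF_NS = "{http://zesty.ca/pfif/1.4}"  # PFIF namespace; version may differ

def fetch_person(record_id: str, api_key: str) -> dict:
    """Fetch one PFIF person record and return a couple of its fields."""
    query = urllib.parse.urlencode({"key": api_key, "id": record_id})
    with urllib.request.urlopen(f"{BASE}?{query}") as resp:
        tree = ET.parse(resp)
    person = tree.getroot().find(f"{PFIF_NS}person")
    return {
        "full_name": person.findtext(f"{PFIF_NS}full_name"),
        "source_date": person.findtext(f"{PFIF_NS}source_date"),
    }

if __name__ == "__main__":
    print(fetch_person("example.org/person.123", "YOUR_API_KEY"))
```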
So what are the next steps? It wants a more joined-up and thought-through general approach to support its effort, a sentiment endorsed by Robin Burton, Mobile Telecoms Relations Consultant, whom we interviewed earlier this year on how SMS capabilities can be set up, through the local mobile operators, in advance of a disaster even happening.
Robin says he's seen a lot of initiatives like the Google one, which he thinks is very worthy, but many fall down because the actual implementation (in a crisis) has not been thought through. The key is the effective integration of the communications or information system with all the disparate players. And the key to that is to involve those players in the process of working out how it should all happen on the night (or across the week), he says.
Google says data providers whose information is clearly licensed and published in standard data formats, including the Common Alerting Protocol (an international standard for publishing and sharing alerts that is used by NOAA, FEMA's iPAWS, and USGS) for Public Alerts, or GeoRSS (an open standard for encoding location information) and/or KML for Crisis Map, "can update their information automatically."
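To make the Common Alerting Protocol concrete, here is a short Python sketch that parses a CAP 1.2 alert and pulls out the fields an alert banner or crisis map would typically display. The sample alert itself is invented for illustration; real alerts come from publishers such as NOAA or FEMA's iPAWS.

```python
# Sketch: parse a CAP (Common Alerting Protocol) 1.2 alert with the standard
# library and summarise its headline fields. The sample alert is invented.
import xml.etree.ElementTree as ET

CAP_NS = "{urn:oasis:names:tc:emergency:cap:1.2}"

SAMPLE_ALERT = """<?xml version="1.0" encoding="UTF-8"?>
<alert xmlns="urn:oasis:names:tc:emergency:cap:1.2">
  <identifier>example-2012-10-29-0001</identifier>
  <sender>alerts@example.org</sender>
  <sent>2012-10-29T18:00:00-04:00</sent>
  <status>Actual</status>
  <msgType>Alert</msgType>
  <scope>Public</scope>
  <info>
    <category>Met</category>
    <event>Hurricane Warning</event>
    <urgency>Expected</urgency>
    <severity>Severe</severity>
    <certainty>Likely</certainty>
    <headline>Hurricane warning in effect for the coastal area</headline>
    <area>
      <areaDesc>Example Coastal County</areaDesc>
    </area>
  </info>
</alert>"""

def summarise(cap_xml: str) -> str:
    """Return the event, severity and affected area from a CAP alert."""
    root = ET.fromstring(cap_xml)
    info = root.find(f"{CAP_NS}info")
    event = info.findtext(f"{CAP_NS}event")
    severity = info.findtext(f"{CAP_NS}severity")
    area = info.find(f"{CAP_NS}area").findtext(f"{CAP_NS}areaDesc")
    return f"{event} ({severity}) for {area}"

print(summarise(SAMPLE_ALERT))
```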
It also advocates using open and common standards for sharing alerting information: Web formats like Atom and RSS (XML-based languages used for web feeds).
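In practice, publishing alerts over Atom or RSS means anyone can poll a feed and pick up new entries automatically. The sketch below uses the third-party feedparser package against a placeholder feed URL; the address and the exact entry fields available will depend on the publisher.

```python
# Sketch: poll an Atom/RSS feed of alerts and list the newest entries.
# Requires the third-party feedparser package (pip install feedparser).
# The feed URL is a placeholder, not a real alert feed.
import feedparser

FEED_URL = "https://alerts.example.org/atom"  # placeholder feed address

def latest_alerts(url: str, limit: int = 5):
    """Yield (title, link, updated) for the newest entries in the feed."""
    feed = feedparser.parse(url)
    for entry in feed.entries[:limit]:
        yield entry.get("title"), entry.get("link"), entry.get("updated")

if __name__ == "__main__":
    for title, link, updated in latest_alerts(FEED_URL):
        print(f"{updated}  {title}  {link}")
```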
It's worth watching Robin Burton interviewed by Martyn Warwick below on how communications need to be prepped BEFORE a disaster strikes.