OGC’s Chief Technology Officer (CTO) is responsible for developing OGC’s Technology Strategy and coordinating the strategy across all programs of the Consortium. Currently the main activities of the strategy relate to the OGC Architecture Board (OAB), OGC Technology Forecasting, and the evolution of the OGC Baseline. This column provides an update on those and related activities from the past three months.
George Percivall's blog
This blog highlights several recent innovations in OGC processes: Changes in the OGC Innovation Program; Community Standards in the OGC Standards Program; and Geospatial Trends Tracking by the OGC Architecture Board.
OGC and OGC’s Testbed 11 interoperability initiative, which addressed Urban Climate Resilience, were officially recognized in the December 2014 Fact Sheet about the White House Climate Data Initiative. Also mentioned in that document were GoodCompany Ventures (GCV) and the expansion of Climate Ventures 2.0.
A history of the OGC Interoperability Program was just published in the ISPRS International Journal of Geo-Information (IJGI). The article was written by the four people who have served as Executive Directors of the program: Jeff Harrison, Nadine Alameh, Terry Idol and George Percivall.
This is the abstract of the paper:
In April, Mike Botts, Botts Innovative Research, Inc. posed a deceptively simple question to an OGC discussion list: “How does one find existing services that implement OGC Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS) interface standards these days?” This fundamental question drew an outpouring of responses describing the many ways to find OGC services. How many services do you think were identified?
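One common first step in confirming that an endpoint really implements WMS is issuing a GetCapabilities request and reading the advertised layers. The sketch below, in Python, shows that pattern; the endpoint URL and the capabilities fragment are illustrative placeholders, not a real OGC service.

```python
# Minimal sketch of WMS service discovery: build a GetCapabilities
# request URL and parse a capabilities document for layer names.
# The endpoint URL and XML sample are placeholders for illustration.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

def getcapabilities_url(base_url: str) -> str:
    """Append the standard WMS GetCapabilities query parameters."""
    params = {"SERVICE": "WMS", "REQUEST": "GetCapabilities", "VERSION": "1.3.0"}
    return base_url + "?" + urlencode(params)

def layer_names(capabilities_xml: str) -> list:
    """Extract the <Name> of each named <Layer> from a WMS 1.3.0 capabilities document."""
    ns = {"wms": "http://www.opengis.net/wms"}
    root = ET.fromstring(capabilities_xml)
    return [el.text for el in root.findall(".//wms:Layer/wms:Name", ns)]

# A tiny stand-in capabilities fragment (real responses are far larger).
SAMPLE = """<WMS_Capabilities xmlns="http://www.opengis.net/wms" version="1.3.0">
  <Capability>
    <Layer>
      <Layer><Name>roads</Name></Layer>
      <Layer><Name>rivers</Name></Layer>
    </Layer>
  </Capability>
</WMS_Capabilities>"""

url = getcapabilities_url("https://example.org/wms")
names = layer_names(SAMPLE)
```

Catalogues (such as CSW endpoints) and web searches are other discovery routes mentioned in the discussion, but a direct capabilities check like this remains the simplest way to verify a single service.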
Nadine Alameh is leaving the OGC staff on April 30, 2014 - today - to take on a new and exciting opportunity. We wish her great success and want to thank her for her years of contributions on OGC staff.
Geospatial data has always been Big Data. Now Big Data analytics for geospatial data are available to let users analyze massive volumes of geospatial information. Petabyte archives for remotely sensed geodata were being planned in the 1980s, and growth has met expectations. Add to this the ever-increasing volume and reliability of real-time sensor observations, and the need for high-performance Big Data analytics for modeling and simulation of geospatially enabled content is greater than ever.
Yesterday, I bought a print of an old map: Virginiae Item et Floridae. This map was drawn in the early 1600s by Jodocus Hondius, an expert cartographer who produced it from several existing maps by experts. In the style of the time, this map was created by an elite cartographer for an elite group of users.