CalGIS 2014 – Reflections on past and future

It has now been a couple of days since the end of the CalGIS conference for 2014.  For those who do not know, CalGIS is the California GIS Conference.  It is hosted every year by the four chapters of the Urban and Regional Information Systems Association (URISA): the Northern California Chapter of URISA, the Central California Chapter of URISA (CentralCalURISA), the Bay Area Automated Mapping Association (BAAMA), and the Southern California Chapter of URISA (SoCalURISA); along with the California Geographic Information Association (CGIA).  The conference rotates its location around the state every year.  This year, the conference took place in Monterey, CA.

This is the 20th year of the conference, and with that being the case, the organizing committee decided to change the format slightly.  In past years, the conference had the standard opening and closing keynote speakers, and then paper presentations split into various tracks to fill the remainder of the time.  This year, as a means of bringing more attention to the utility of this conference, the format was changed to include a series of town-hall meetings and panel discussions on the first day and then a reduced number of paper presentations in tracks on the second day.

The presenters included Mike Migurski from Code for America, David Thau from Google, Eric Gunderson from Mapbox, and Jack Dangermond from Esri.  There was also a panel discussion that included Mike Migurski and David Thau, as well as Scott Gregory, Geospatial Information Officer (GIO) for California; Alex Barth from Mapbox; Ragi Burhum from AmigoCloud; Jeff Johnson from Boundless Geo; Chris Thomas from Esri; and Mark Greninger, GIO for Los Angeles County.

The fact that such a range of speakers could be brought together for this conference was really impressive.  They all had unique and, in a number of cases, complementary viewpoints regarding data, the openness of data, the future of GIS software (both for consumers and on the data-processing end), and the future of GIS as a profession.  There were also some clashing viewpoints, but I think that points to the utility of this conference: everyone was able to say their piece, and thus added to the greater sum of the experience.  I took away a number of points over the course of the conference, and want to expand on them here.

  1. There was a distinct lack of women as keynotes and presenters:  This was brought up to me by at least one person who looked over the program prior to the conference.  It was also mentioned obliquely by Eric Gunderson from Mapbox, whose last slide was a notice that Mapbox is expanding and is looking for, and encouraging, women who want to be (or already are) in the geospatial industry.  This definitely needs to be addressed in future conferences; I don’t know that it can be handled directly, but it has to be strongly encouraged.  The planning committee has to make an effort to find interesting and engaging keynote speakers, whether male or female.  The other side of the coin is that people who think they have something of interest to the greater GIS population need to stand up, wave their hands around, and make themselves heard.  I know that the relative mix of speakers changes year to year based on the makeup of the organizing committee, but more of an effort can and should definitely be made.
  2. Open data is the future:  A number of the presentations focused on the benefits that come with the availability of open data.  I think the general takeaway is that it has become very easy and cost effective to bring together tools for various types of data analysis.  There are also a lot more people with the skills to build these tools easily.  This means that more business markets are being opened by the analysis products being developed.  There is also a lot of research being done through schools, non-profits, and other groups.  All of these groups need data to work with.  The government has the best and easiest ability to gather large amounts of data and keep it organized.  Where it doesn’t necessarily do so well is in developing applications for providing that data to the public.  A key example was OpenStreetMap.  It was initially developed in response to Ordnance Survey data in the UK, which is very heavily copyright protected.  That data was prohibitively expensive for people to use, and with the new mobile tools available, it was a natural follow-on to create a new crowd-sourced dataset.  This brings up a conundrum: where would the OSM data, and the businesses growing up around it, be had the Ordnance Survey data been freely available to the public?  There have been some very significant data collection efforts, like mapping roads in Haiti after the earthquake, that were only possible at the time because of the OSM tools that were already developed.  There was also a panel discussion with a lot of talk about data.  An interesting point was that, for all the standards for putting out web services for data, there is a desire for the data to simply be available for download in as simple a format as possible.  A direct request was for shapefiles and CSV files as necessary.  There is, of course, a flip side to putting this data out to the public.
There is some data that should not be released due to the sensitivity of the information, whether for the security of physical locations or because it includes personally identifiable information.  There was definitely a push and pull between the companies developing the tools and applications and the public agencies in this regard.  The companies want the data available, feel that it should be made so, and hold that each person or group that consumes it would be individually responsible for the consequences.  The public agencies see the need to first have a set of standards for what can and cannot be shared, and then release layers that adhere to those standards.  In the end, I think it is all positive in that more and more data is being made available to the public without restriction on use.
  3. There were a number of discussions about the future of the CalGIS conference.  These were interesting because I think they reflect a larger context: the part that professional groups and conferences play in the GIS arena.  As I noted before, the conference this year had more of a focus on town-hall meetings and open-format discussions, as opposed to all paper presentations.  I think it was very useful to have this forum for getting these groups of people together to start talking about these issues, when they normally wouldn’t have a chance to be in the same room.  This, in general, seems to be the place that professional groups like URISA can occupy, providing a connection between the public sector and the private contractors who can do projects for them.  There is also the opportunity to get feedback from both the public and private sectors on what they need and are looking for from people coming out of school, either with a 4-year degree or a GIS certificate of some sort.  The needs are not simply a matter of the degree or certificate, but also of the skills someone should be looking to acquire: programming and scripting, the ability to use GPS devices, an understanding of spatial analysis, and so on.  This is good information for educators to have, because I think in many cases there is a disconnect between what is offered for a GIS certificate or degree and the skills needed to be successful in the workplace.  Having those needs incorporated into curricula is going to help not only people new to the industry, but also those looking to improve their skills.
  4. The end of the conference had two keynotes, one from Eric Gunderson of Mapbox and one from Jack Dangermond of Esri.  It would have been difficult to find two more completely different points of view on GIS.  Mapbox is kind of turning the GIS processing world on its head with its use of Amazon cloud services to massively scale up the processing of datasets.  Their plan is to roll out as many fast, scalable datasets as they possibly can.  The amount of data becoming available is truly massive, and the challenge is to be able to deal with it all and figure out how to analyze it.  Jack’s presentation was pretty impressive in summarizing the entire ArcGIS ecosystem.  The push toward web apps and ArcGIS Online is huge.  There is also at least an effort to embrace open data, and a form of “open source” that is uniquely Esri.  Jack’s marketing folks had him on point and he was in very good form.
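
The direct request in point 2 for plain CSV downloads is easy to appreciate when you see how little code it takes to consume one.  Here is a minimal sketch in Python using only the standard library; the column names and sample rows are hypothetical, not from any particular agency’s release:

```python
# Sketch: why plain CSV downloads are attractive.  A few lines of stdlib
# Python turn a downloaded file into usable point data.  The columns
# ("name", "lon", "lat") and the sample rows are hypothetical examples.
import csv
import io

# Stand-in for the contents of a downloaded CSV file.
sample = """name,lon,lat
City Hall,-121.4944,38.5816
Ferry Building,-122.3937,37.7955
"""

# Parse each row into a (name, longitude, latitude) tuple.
points = [
    (row["name"], float(row["lon"]), float(row["lat"]))
    for row in csv.DictReader(io.StringIO(sample))
]
```

Each record is now an ordinary tuple, ready to hand to whatever mapping or analysis library you prefer; no web-service client, schema negotiation, or vendor tooling required.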

I am always torn when I see an Esri presentation, especially compared to a lot of the other presentations given at the conference.  I’ve used Esri products for years, like many people, and there are things I do in ArcGIS that are really not simple to do in other packages.  At the same time, what I continually see, and what unfortunately was evident at CalGIS, is a complete blindness on Esri’s part to the existence of any other source of spatial data, storage format for spatial data, or software for working with spatial data.  Honestly, when Jack spoke, it was as if Eric had not just spoken before him and, in fact, didn’t even exist.  Now, don’t get me wrong: Esri has every right to push their product, and by incorporating more and more aspects of spatial analysis and cartography into their software, they increase the profit they can make through software sales and maintenance.  That is just good business.  At this point, there aren’t any software packages close to competing with them on a large scale.  I wonder, though, if having the blinders on to any other innovation out there could hurt Esri in the future.  If their standard marketing statement is that you have to use their software, their data storage formats, and their data schemas, what happens to the people who cannot afford some or all of those components?  For example, what if you want to store your spatial data in a PostgreSQL/PostGIS database?  That would allow you to interact with many other software packages, both web based and otherwise.  It is robust, well supported, and standards based.  Seems perfect.  The problem is that you cannot load layers from there into ArcGIS and edit them unless you have ArcSDE on top.  This means that in order to edit these layers, I either have to do some sort of synchronization with an Esri-supported format, edit there in ArcGIS, and then synchronize back, or find another software package for editing directly in PostGIS.  This pushes people away from the Esri universe because it makes it either too difficult or too cost prohibitive to use all Esri products.  Supporting open spatial data standards for not just cartography, but also editing and analysis, would seem to increase the potential Esri user base rather than shrink it.  I’m sure this debate will continue, and it will be interesting to see how ArcGIS evolves in the future.
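
To make the editing gap concrete, here is roughly what editing a PostGIS layer directly looks like, with no ArcSDE involved.  This is a minimal sketch: the table name, column names, and database name are hypothetical, and the live connection (via psycopg2) appears only in comments since it requires a running PostGIS instance:

```python
# Sketch of editing a feature's geometry directly in PostGIS, bypassing
# ArcSDE.  The "parcels" table with fid (int) and geom (geometry) columns
# is a hypothetical example, not a real schema.

def build_update_sql(table):
    """Build a parameterized UPDATE that rewrites one feature's geometry
    from well-known text (WKT), using standard PostGIS functions."""
    return (
        f"UPDATE {table} "
        "SET geom = ST_SetSRID(ST_GeomFromText(%s), %s) "
        "WHERE fid = %s"
    )

sql = build_update_sql("parcels")

# Running it against a live database would look something like this
# (requires psycopg2 and a PostGIS-enabled database, here named "gisdata"):
#
# import psycopg2
# conn = psycopg2.connect(dbname="gisdata")
# with conn, conn.cursor() as cur:
#     cur.execute(sql, ("POLYGON((0 0, 0 1, 1 1, 0 0))", 4326, 42))
# conn.close()
```

QGIS exposes this same direct-edit capability through a GUI, which is the “another software package” route mentioned above.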

All in all, I think CalGIS 2014 was a really good conference, and it succeeded in changing the game for its 20th year.  There is as much of a need for this conference as there is for the Esri User Conference every year.  People need a chance to work with the software experts at Esri and see how the software is being used out in the industry.  At the same time, CalGIS provides the opportunity to expand the discussion to the larger context of GIS, unconstrained by a particular software package.  With the speed of innovation in GIS software and web-based mapping, and the explosion of available datasets, this discussion becomes even more important.  I have a soft spot for CalGIS as well because I’ve been on the planning committee for a number of years and have met some amazingly talented, innovative people there whom I am happy to call my friends.  In short, I can’t wait for next year.  Hope you will be able to join us in Sacramento!