The November/December issue of Earth Imaging Journal ran an article titled “Justifying the Cost of Authoritative Imagery in a World of Free Data.” The article was based, in part, on a recent NSGIC publication titled “Justifying the Cost of Authoritative Imagery… A Brief Review of the Issues.” The NSGIC publication grew out of discussions at the State Caucus Meeting during the NSGIC Annual Meeting in Orlando, Florida.
Better cities can be created by giving citizens creative access to information. A new project is being launched in Europe to create Smart City applications and transfer them from city to city. The project will use an open-source service developer toolkit to make it easier for developers to create new and innovative applications that work across the continent. The project is called CitySDK (City Service Development Kit). The €6.84 million project is funded in part by a grant from the European Commission.
The basic idea is to open city data resources to developers, allowing them to create useful applications, much as Washington, DC did when it conducted its Apps for Democracy competition. Unlike Washington, DC, CitySDK wants the same app to run in many cities across Europe. Pilot projects will be launched in three cities starting in early 2013: Amsterdam, Helsinki, and Lisbon. Five other cities are also participating: Barcelona, Istanbul, Lamia (Greece), Manchester, and Rome. This effort will require cities to release their data in standard formats.
Europeans see this approach benefiting citizens who use the applications, cities that get apps written for them, and developers who gain expanded markets. The expectation is that smaller, more nimble firms, what Europeans call SMEs (small and medium enterprises), will take advantage of this opportunity. In their documentation, they say CitySDK will provide developers with a “European advantage against the US and Asian competitors.”
Note: CitySDK was identified in a special report on Technology and Geography in the October 27 issue of The Economist. The overriding theme of the report is that geography is enhanced by smartphones and connectivity, not diminished. Smart cities take advantage of technology by helping people navigate, find interesting places, and even report problems that can be fixed quickly.
-updated 10/25/12 2:17pm-
Whether it be emergency medical response, the delivery of goods and services, tourists trying to find a hotel or restaurant, preparations for the census, or local crime analysis, the ability to find addresses is critical to public safety, economic efficiency, and government operations. The need to locate addresses accurately is growing at all levels, including, of course, at the global level from within Google, Bing, Apple, MapQuest, and other mapping engines.
The process of assigning new addresses, or modifying existing ones, is owned and managed by local government Address Authorities. This should not change: trusted local expertise is closest to the ground, the most connected, and likely the most invested in getting the information correct. Yet many redundant and sometimes competing efforts are made to compile an inventory of official and unofficial physical addresses. The public and private sectors would both benefit greatly if local Address Authorities extended their current responsibilities to provide a publicly available inventory of physical addresses and their locations in geographic coordinates.
What follows is a suggestion for best practices that would make the most of local addressing knowledge, for the benefit of local communities, state and federal government, and private sector enterprise.
This is just a strawman; suggestions for improvements to this proposal, whether additions, modifications, or simplifying/enhancing the concepts and related messaging, are encouraged.
BSA — Basic Street Address. Consists of a house number; a fully qualified street name including prefix and suffix directionals and street type; and the address reference system (a name associated with the addressing area, not a ZIP code) within which the address is found. Building names on campus-like facilities should be incorporated into the BSA inventory.
BSA + Geo — The Basic Street Address plus geographic location coordinates, expressed as latitude/longitude or as an x,y coordinate pair in a recognized coordinate system. BSA+Geo can include several separate address point records for the same address, where appropriate, to represent access points, entrance points, and multiple structures.
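To make the two definitions above concrete, here is a minimal sketch of how a BSA+Geo record might be modeled. All field names are hypothetical illustrations for this strawman, not drawn from any published standard:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class BsaGeoRecord:
    """One address point record; field names are illustrative only."""
    house_number: str               # e.g. "1600"
    prefix_directional: str         # e.g. "N" (may be empty)
    street_name: str                # e.g. "Main"
    street_type: str                # e.g. "St"
    suffix_directional: str         # e.g. "" (may be empty)
    address_reference_system: str   # addressing-area name, not a ZIP code
    latitude: float                 # 2D position in geographic space
    longitude: float
    point_role: str = "primary"     # e.g. "primary", "access", "entrance", "structure"
    building_name: Optional[str] = None  # for campus-like facilities

    def basic_street_address(self) -> str:
        """Assemble the BSA string, skipping empty directional components."""
        parts = [self.house_number, self.prefix_directional, self.street_name,
                 self.street_type, self.suffix_directional]
        return " ".join(p for p in parts if p)
```

Because a single address may legitimately carry several point records (access, entrance, structure), the `point_role` field sketches one way to keep them distinct while sharing the same BSA.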
Proposed Best Practices
- The local government Address Authority is responsible for maintaining an inventory of official and unofficial BSA information.
- The BSA information is kept together with a coordinate pair reflecting, at minimum, a 2D position of the address in geographic space. (Unit numbers, aka sub addresses, are also worth inventorying, but introduce sufficient complexity that they are not covered in this proposal).
- State and local authorities should understand the locally pertinent elements of the FGDC Addressing Standard, the Census Address Data Submission Guidelines, and the forthcoming NENA 911 GIS data model, and should consider the ability to translate their local address data into these data standards.
- The geographic location(s) for each BSA should be accurate enough to guide emergency responders to the desired location without ambiguity with respect to ‘which structure’ and ‘from which road’.
- The inventory of BSA+Geo is maintained through a separate business process and does not include any resident names or other personal information.
- For new addresses, the BSA+Geo should be updated as soon as building permits are issued for new construction, to accommodate deliveries, inspections, accidents, etc.
- The BSA+Geo inventory is maintained locally by a stewarding agency that is clearly identified and one that coordinates well with public safety operations.
- The BSA+Geo inventory is collected and maintained with minimal redundant resources.
- The BSA+Geo inventory should consist of all physical addresses but could be linked to corresponding mailable addresses where this is desired.
- The inventory of BSA+Geo is public information*, and is actively shared via web endpoints (data file and service URLs).
- Incentivizing willing states to aggregate address data from local address authorities into regional, statewide, and/or nationwide data resources is a logical and promising approach.
- Web-based feedback channels exist to get issues with the BSA+Geo content to the local data steward to be adjudicated and acted upon where appropriate.
- ‘Time to market’ for changes to BSA+Geo is measured in hours or days (i.e., an address inventory is a continual, not a periodic, activity).
- Standardized metrics are compiled and published openly that track the completeness, accuracy, and currency of the BSA+Geo data content.
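The practice of sharing the inventory via web endpoints could be met with simple, standard formats. As one hedged example, an address point could be published as a GeoJSON Feature; the property names here are assumptions for illustration, not a prescribed schema:

```python
import json


def address_point_to_geojson(house_number, street, addressing_area, lon, lat):
    """Illustrative only: serialize one BSA+Geo record as a GeoJSON Feature
    suitable for publication at a web endpoint (property names are assumed)."""
    return {
        "type": "Feature",
        # GeoJSON coordinates are ordered longitude, latitude
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {
            "house_number": house_number,
            "street": street,
            "addressing_area": addressing_area,
        },
    }


feature = address_point_to_geojson("1600", "N Main St", "Springfield", -89.65, 39.80)
payload = json.dumps(feature)  # ready to serve from a data-file or service URL
```

Note that, consistent with the practices above, the record carries no resident names or other personal information, only the address components and the coordinate pair.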
The Census Bureau and US Postal Service both currently attempt to keep national address inventories. The Census Bureau does not focus on business addresses and its primary need is in preparing for the decennial census every 10 years. The USPS has an inventory of mail delivery points which includes only businesses and residences served by street delivery mail.
Unfortunately, neither agency shares its address inventory, citing Federal laws that pertain to either a) information collected in conducting (not preparing for) the census survey (Title 13, Sections 9 & 214) or b) a prohibition on identifying addresses of a specific “postal patron” (Title 39, Section 412a). Take out any direct association of addresses with names of residents/patrons, and the current interpretations of these laws by Census and USPS seem somewhat misguided.
With this in mind, perhaps an additional best practice would be the adoption of a policy statement similar to the one shown below.
An address point record (APR) consists exclusively of the following digital information components:
- A descriptive street address, in a standardized format;
- addressing zone information (zipcode, city, and/or addressing authority) that denotes a specific area within which the address is located; and
- a numeric coordinate pair (latitude-longitude or equivalent) that represents the geographic location associated with the address.
An APR should be considered public information when:
- Local, tribal, regional, or state government has recognized the address through a formal process or for purposes of services delivery (utilities, emergency response, etc); and
- the APR is not provisioned with additional descriptive information formally classified as protected, private, or sensitive.
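The two-part test above can be sketched as a simple predicate. This is a hypothetical illustration of the proposed policy statement, not an implementation of any agency's actual rules:

```python
def apr_is_public(formally_recognized: bool, extra_classified_fields: set) -> bool:
    """Sketch of the proposed policy test: an address point record (APR) is
    public when (1) a local, tribal, regional, or state government has
    recognized the address through a formal process or for services delivery,
    and (2) the record carries no additional descriptive fields classified as
    protected, private, or sensitive."""
    return formally_recognized and not extra_classified_fields


# A recognized address holding only the three allowed components
# (street address, addressing zone, coordinate pair) is public:
print(apr_is_public(True, set()))                   # public

# The same record augmented with a resident's name is not:
print(apr_is_public(True, {"resident_name"}))       # not public
```

Framed this way, the Census/USPS concern dissolves: the public record never contains the resident or patron association that the cited statutes protect.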
A panel discussion about the National Broadband Map was held at the Woodrow Wilson Center in Washington, D.C. on October 15, 2012 (Case Study report here). The discussion was also webcast. Panel members included:
- Michael Byrne, GIO for the Federal Communications Commission (FCC);
- Zach Bastian, a recent JD graduate who wrote the Case Study report;
- Haley Van Dyck, lead of the Digital Government Strategy, and e-government policy analyst for the Office of Management and Budget;
- Greg Elin, Chief Data Officer at the FCC;
- Ben Balter, Presidential Innovation Fellow who recently wrote Towards a More Agile Government; and
- Dr. Sean Gorman, Chief Strategist for ESRI’s DC Development Center.
The gist of the presentation was to highlight how well the broadband initiative worked. It is, as Zach Bastian put it, “the poster child for government and IT collaboration.” He also highlighted four main wins that developed from this project.
- Incredible savings (specifics seen in the report).
- Agile development, which is an iterative process. Features were added as needed instead of the usual governmental “waterfall” methodology where all specs are listed in advance.
- Open data, participation (i.e. speed testing by citizens, using mobile applications provided by FCC), transparency, and collaboration by NTIA, states, and internet providers, as well as NSGIC. Open data provided an “Aha!” moment because even though the government had to give up a bit of control, it meant that their systems, points, and work spread much faster than they could have imagined.
- Tangible effect on policy. This project helped to address the digital divide.
After this introduction, the panel was asked a series of questions. The first question, “What can government do to help modernize?” led to some great insights:
- Build tools to accomplish culture change. You can’t just tell people about change; you need to provide the tools to make it happen.
- Lead by example. Seeing others have impact and success leads more people to join in.
- Hire people with new mindsets.
- Encourage risk-taking so people can learn from failure, even in government.
- Innovation and change are hard, and assuming they're easy makes them harder.
- Find opportunities to move one or two people to do things differently. Even small numbers are good.
- “Everyone has to learn to swim for themselves.” –Greg Elin. That is, you may have to explain and help each person climb on board with these new ideas. Be patient! Along these lines, physicist Max Planck’s famous quote, “Truth never triumphs—its opponents just die out. Science advances one funeral at a time,” was recalled.
In addition to these points, the National Broadband Map serves as a great example of how data can come alive for people when attention is paid to design as well as factually correct information. With such a combination, the realization of people’s ideas can be very powerful.
Last May I wrote about the European March Toward Open Data. A Finnish study had found 15% better economic performance in countries with open data policies compared to those trying to recover their costs. Based on that finding, Finland had begun making their geodata available free of charge to all users.
I neglected an important issue. The growth came from small and medium enterprises (SMEs). Lower costs allow SMEs to develop new GI-based products and services. Large firms, able to pay full fees, didn’t do anything different when data became cheaper. See Does Marginal Cost Pricing of Public Sector Information Spur Firm Growth?
Tom Roff (FHWA) and Steve Lewis (USDOT GIO) celebrated the culmination of joint work with NSGIC by announcing President Obama’s signing of Public Law 112-141, which authorizes the use of federal funds for the production of statewide centerline data and attribution for all paved and unpaved public roads. NSGIC’s Transportation for the Nation initiative (TFTN) has been working toward a national road centerline dataset for many years. This law provides the resources and the driver for states to collect all-roads data within their boundaries, and it does not require the usual 20% state funding match for the effort. Planning for the data collection will begin in October 2012, and the first data reporting will begin in 2014. Once again, NSGIC has provided significant national value by encouraging the efficient and effective use of geospatial technology.
- Ed Arabas
Our keynote speaker, Bob Austin, PhD, of the City of Tampa, gave a presentation on four views of GIS:
- Personal View, where the main takeaway is that it’s all about the data. His advice is to document requirements (the ‘How’ will come later), and to never sacrifice “Good” in favor of “Fast” and “Cheap”.
- Local View, using the City of Tampa’s experience with the challenges of information access. Specific challenges include data access, naming, policy, and usability. A major takeaway from this view is that policy and data-sharing issues consume just as much time in a project as the technical components. His advice is to plan ahead for the sociological and policy aspects in project planning.
- Industry View, using his experiences with GITA (Geospatial Information & Technology Association) as the example. 85% of US Infrastructure is owned by industry, and understanding infrastructure interdependencies is key.
- National View, using his service with NGAC (National Geospatial Advisory Committee). He emphasized building once and using many times by taking a portfolio approach. Sharing data makes sense. Check out the work they have done at www.geoplatform.gov.
- A bonus 5th view is the International View, citing several examples that emphasize interoperability and the need for international standards.
He also shared the City of Tampa’s experience with the Republican National Convention and the 3 major concerns that they planned for:
- 1st concern was Hurricanes; in reality Tropical Storm Isaac prompted the cancellation of the first day of the convention.
- 2nd concern was Terrorism; in reality they did not have any terrorist attacks.
- 3rd concern was Violent Protests; in reality there were some non-violent protests, but nothing violent. There were only 2 arrests during the course of the convention.
To support the convention they stood up a situational awareness dashboard called TIGER (Tampa Information and Geographical Resources) that now has upwards of 185 data sets.
It was a great presentation to get insight into some of the different perspectives of GIS. As Bob stated, “GIS is not just a good idea, it’s inevitable”.
The opening day of the annual conference began with an enlightening welcome from Florida’s state GIS coordinator, Richard Butgereit. He shared a couple of interesting facts about Florida. Between May and August, Florida went from a state of extreme drought to very wet, due in part to three tropical storms and an increased frequency of severe storms. Richard’s advice: if it thunders, run for cover!
We then had the pleasure of receiving a presentation from the State Archaeologist of Florida, James Miller, PhD, who shared examples of how GIS has made a profound difference in how they conduct their research and preservation efforts. He explained that GIS is the most powerful tool he has come across to help with his work and to explain its results. Examples included research into the Gainesville Depot, which they discovered had been moved three times since the 1800s. Efforts are nearing completion to move the depot to its original location and restore it to become a useful building again. The second example was Heritage Park in the Bahamas. Their advice to the creators of the park: you don’t need to dig, you need a plan. Their research helped identify points of interest and the ideal location for the park using a combination of physical documents and GIS analysis. Finally, a highlight was the story of their research of Freetown in the Bahamas. Their research combined historic imagery, physical documents, and interviews with former residents, including the Cooper family, to document what the town looked like in the past. They were able to identify wells, community centers, and grave sites, and to get a better understanding of the physical and cultural characteristics of the town. What an honor it was to hear about this research, and what great examples of how GIS analysis can help preserve our past!
The NSGIC 2012 Annual Conference kicked off today with a very informative and dynamic workshop facilitated by Sanborn. The workshop began with a presentation on sensor technologies, where they are today and where they are heading in the future. The presentation quickly took on the feel of an extended state caucus, with an exchange of questions, answers, and discussion on topics ranging from defining data classifications to quality control methodology, RFPs, and uses for 3D point clouds.
Here are some highlights:
- Suggestions for NSGIC to develop a common RFP “set” for procurement that all states can use
- Several things drive up the cost of acquisition procurement, including: forced use of specific technologies; not allowing the experts to guide the process with creative options; and lack of clarity regarding what is being requested.
- As cloud solutions become available, procurement will change for the variety of end users and service levels, so there are a lot of new things to think about.
- A suggestion was made for Sanborn and NSGIC to develop a 1-pager with sample pictures for state reps to use as a handout to answer the question “why buy imagery when I can use free online resources?”
NSGIC thanks Brad Arshat, Sanchit Agarwal, and Learon Dalby; they know their stuff! Please contact them via Twitter @SanbornMap or email@example.com with any questions.
In true NSGIC style, a discussion group met to talk about GIS as IT until about 10:45. Look for some useful tools born from this discussion to help market GIS to state leadership. Hats off to Danielle Ayan of Georgia for leading this discussion!
Looking forward to a very full day tomorrow including keynote speaker Bob Austin from the City of Tampa, LandSat and You, and the Corporate Sponsor Reception and Buffet!
NSGIC’s 2012 Annual Conference kicks off less than a week from today. For all who make the investments in time and money to participate, the ROI is typically very high. At least that’s what members tell us in the post-Conference evaluations and comments.
To those who have never attended or are attending for the first time, there may be concerns about the time that will be spent out of the office and the costs of travel/registration, and how those expenses will be justified. Although we know it’s there, NSGIC didn’t explain its Conference “ROI” very well or help prospective attendees justify their Conference expenses… until now.
Thanks to the efforts of several NSGIC volunteers, led by Membership Committee Chair Leland Pierce (NM), NSGIC has tools that help fill this need. “Attending NSGIC’s Annual Conference is Worth Every Penny” is a two-page guide to identifying and quantifying the benefits of NSGIC Conference attendance. It provides helpful details on how our Conferences are organized, how NSGIC controls costs, and how NSGIC provides Conference Grants to help State representatives fund their participation. Read the document at http://www.nsgic.org/public_resources/Conference_ROI_060512_Final.pdf and be sure to visit the link for developing a conference attendance justification document to prove the value of attending NSGIC Conferences.