Promoting Advancement in Surveying and Mapping
Crisis map mashups ♦ Modernizing the NSRS ♦ Semantic web ♦ Weather vs Climate
National LiDAR Center moves to Houston ♦ Interactive georeferencing
AAGS, CaGIS, GLIS, and NSPS serve to promote the interests of geodesists, cartographers, GIS experts & surveyors
ACSM Bulletin (ISSN 0747-9417). Editor: Ilse Genovese. Publisher: Curtis W. Sumner.
The official professional magazine of AAGS, CaGIS, GLIS, and NSPS
American Association for Geodetic Surveying (AAGS): Barbara S. Littell (president), Curtis L. Smith (president-elect), Michael L. Dennis (vice president), Ronnie Taylor (immediate past president), Daniel J. Martin (treasurer). Directors: Edward E. Carlson, Karen Meckel. [www.aagsmo.org, 240.632.8943]

Cartography and Geographic Information Society (CaGIS): Scott Freundschuh (president), Kari J. Craun (president-elect), Terri Slocum (vice president), Alan Mikuni (immediate past president), Kirk Eby (treasurer). Directors: Gregory Allord, Jean McKendry, Robert M. Edsall, Michael P. Finn, R. Maxwell Baber, Sarah Battersby, Charley Frye

Geographic and Land Information Society (GLIS): J. Peter Borbas (president), Coleen M. Johnson (vice president), Robert L. Young (immediate past president), Stacey Duane Lyle (treasurer), William M. (Bill) Coleman (secretary). Directors: David R. Doyle, Bruce Hedquist, Francis W. Derby, Joshua Greenfeld. [www.glismo.org, 240.632.9700]

National Society of Professional Surveyors (NSPS): A. Wayne Harrison (president), William R. Coleman (president-elect), Robert Dahn (vice president), John R. Fenn (secretary/treasurer), John D. Matonich (immediate past president), Patrick A. Smith (chairman, Board of Governors), J. Anthony Cavell (secretary, Board of Governors). Directors: Stephen Gould (Area 1); Lewis H. Conley (Area 2); Joe H. Baird (Area 3); Wayne A. Hebert (Area 4); Jan S. Fokens (Area 5); Larry Graham (Area 6); Jeffrey B. Jones (Area 7); Henry Kuehlem (Area 8); Carl R. CdeBaca (Area 9); Timothy A. Kent (Area 10). [www.nspsmo.org, 240.632.8950]

ACSM Congress: Jerry Goodson (chair, NSPS). AAGS delegates: Daniel J. Martin (chair-elect), Steve Briggs, Wes Parks (alternate). CaGIS delegates: Doug Vandergraft, Alan Mikuni, Aileen Buckley (alternate). GLIS delegates: Joshua Greenfeld, John Bean, Robert Young (alternate). NSPS delegates: John Matonich (treasurer), Rich Barr (alternate), John Fenn (alternate). John Swan (NSPS Foundation representative, associate member); Patrick Kalen (Council of Sections representative, associate member); John Hohol (Sustaining Member Council representative); Curtis W. Sumner (secretary, ACSM Executive Director)

Editor: Ilse Genovese, 6 Montgomery Village Avenue, Suite 403, Gaithersburg, MD 20879. Ph: 240.632.9716, ext. 109. Fax: 240.632.1321. E-mail: <email@example.com>. URL: www.webmazine.org
© 2010 American Congress on Surveying and Mapping. The magazine assumes no liability for any statements made or opinions expressed in articles, advertisements, or other portions of this publication. The appearance of advertising in the ACSM Bulletin does not imply endorsement or warranty by ACSM of advertisers or their products.
The ACSM Bulletin is an official publication of
American Association for Geodetic Surveying
CaGIS
Cartography and Geographic Information Society
GLIS
Geographic and Land Information Society
National Society of Professional Surveyors
NSPS Foundation, Inc
ACSM Sustaining Members: Autodesk, Inc. ♦ Berntsen International ♦ Blueline Geo ♦ Earl Dudley Associates ♦ ESRI ♦ First American Data Tree ♦ Hugo Reed & Associates ♦ Leica Geosystems ♦ LIS Survey Technologies Corporation ♦ Magellan ♦ NOAA, National Geodetic Survey ♦ Professional Publications, Inc. ♦ Reed Business – GEO ♦ Roadway ♦ Robert Bosch Tool Corporation ♦ Schonstedt Instrument Co. ♦ SECO Manufacturing ♦ Sidney B. Bowne, LLP ♦ Sokkia Corporation ♦ Surv-Kap, Inc. ♦ Topcon Positioning Systems ♦ Trimble Navigation ♦ USDI Bureau of Land Management/Cadastral Survey ♦ USDI Fish & Wildlife Service ♦ USDI Minerals Management Service ♦ Victor O. Schinnerer & Company
june 2010 ACSM BULLETIN
Among our contributors
Sophia B. Liu (“Crisis Map Mashups in a Participatory Age,” p. 10) is a PhD candidate at the University of Colorado at Boulder. <Sophia.Liu@colorado.edu>

Anahi Ayala Iacucci (“Crisis Map Mashups in a Participatory Age,” p. 10), Columbia University, School of International and Public Affairs; Director, Ushahidi Chile@SIPA, and Crisis Mapping Coordinator. <firstname.lastname@example.org>

Renee Shields (“NOAA to Modernize NSRS,” p. 23) is Height Modernization Manager with the Department of Commerce’s NGS/NOAA. <Renee.Shields@noaa.gov>
www.webmazine.org
THE PUBLISHER: The American Congress on Surveying and Mapping (ACSM) and its member organizations—AAGS, CaGIS, GLIS, NSPS, and NSPS Foundation, Inc.; Sustaining Members; and Associate Councils.

EDITORIAL POLICY: The American Congress on Surveying and Mapping publishes the ACSM Bulletin to provide current scientific, technical and management information in the fields of surveying, cartography, geodesy, GIS, and photogrammetry, and to communicate news on developments in the geospatial data industry of interest to the member organizations of ACSM. ACSM is not responsible for any statements made or opinions expressed in articles, advertisements, or other portions of this publication. The appearance of advertising in the ACSM Bulletin does not imply endorsement or warranty by ACSM of advertisers or their products. Submit articles, press releases, and all other matter for consideration for publication to Ilse Genovese, Editor, ACSM Bulletin, 6 Montgomery Village Ave., Suite 403, Gaithersburg, MD 20879. E-mail: <email@example.com>. Phone: 240/632-9716, ext. 109; Fax: 240/632-1321.

RESTRICTIONS AND PERMISSIONS: Articles to which ACSM does not own rights are so identified at the end of the article. Permission to photocopy for internal or personal use may be obtained by libraries and other users who register with the Copyright Clearance Center (CCC) by paying $2.50 per copy per article directly to CCC, 222 Rosewood Dr., Danvers, MA 01923. [Fee Code: 07479417/97 $2.50. © 2010 American Congress on Surveying and Mapping.] This consent does not extend to copying for general distribution, advertising or promotional purposes, creating new collective works, or resale. Other requests to photocopy or otherwise reproduce material in this magazine should be addressed to the Editor, ACSM, 6 Montgomery Village Avenue, Suite 403, Gaithersburg, MD 20879. Phone: 240/632-9716, ext. 109. Fax: 240/632-1321.

CIRCULATION AND COPYRIGHT: ACSM Bulletin (ISSN 0747-9417) is published
Dru Smith (“NOAA to Modernize NSRS,” p. 23; white paper on “Improving the NSRS,” p. 24) is Chief Geodesist with the Department of Commerce’s NGS/NOAA. <Dru.Smith@noaa.gov>

Eric B. Wolf (“Introduction to the Semantic Web” column, p. 25) is a USGS cartographer and a PhD student at CU-Boulder. <ebwolf@gmail.com>

Mike Dana (“Georeferencing with an Interactive Pen Display,” p. 15) is Business Development Manager with Wacom. <firstname.lastname@example.org>
bimonthly—February, April, June, August, October, and December—by the American Congress on Surveying and Mapping (ACSM), 6 Montgomery Village Avenue, Suite 403, Gaithersburg, MD 20879. Copyright 2010 American Congress on Surveying and Mapping. Periodicals postage paid at Gaithersburg, Md., and additional mailing offices. Postmaster: Send address changes to ACSM Bulletin, Member Services Department, 6 Montgomery Village Avenue, Suite 403, Gaithersburg, MD 20879.

MEMBERSHIP INQUIRIES/CHANGE OF ADDRESS: Membership Coordinator, 6 Montgomery Village Ave., Suite 403, Gaithersburg, MD 20879. Ph: 240.632.9716 ext. 105. Fax: 240.632.1321. E-mail: <email@example.com>. URL: www.acsm.net/membership.html

SUBSCRIPTIONS: The 2010 subscription rate for the printed publication is $100 (U.S. addresses) or $115 (foreign addresses). Subscription rates for the online version are—online only: $100 (U.S. and international); online and print: $126 (U.S.) or $140 (international). Single copies are sold to non-members at $8 per copy, plus handling and postage. Membership dues include an annual subscription to the ACSM Bulletin ($40), which is part of membership benefits and cannot be deducted from annual dues. Single copies are sold to members (U.S. and foreign) at $6 per copy, plus handling and postage. Subscriptions handled by The Sheridan Press Subscriber Services: Ph. 717.632.3535 ext. 8188; Fax: 717.633.8920; E-mail: <firstname.lastname@example.org>.

ADVERTISING: Current advertising rates displayed at http://www.webmazine.org. Inquiries: John D. Hohol, 608.358.6511. E-mail: <email@example.com>.

PRINTED BY: The Sheridan Press, 450 Fame Ave., Hanover, PA 17331.

COVER DESIGN BY: Phil Wolfe Graphic Design, Hanover, Pennsylvania.
ACSM Bulletin, June 2010, No. 245
Crisis Mapping 10
Crisis map mashups in a participatory age — By Sophia B. Liu and Anahi Ayala Iacucci

Geospatial Careers 16
Where are the jobs? — By Ilse Genovese

Modernizing the NSRS 23
NOAA rolls out a plan for a modernized NSRS at a Summit in May — By Renee Shields and Dru Smith

Introduction to the Semantic Web 25
Part one of a primer on Web 3.0 — By Eric Wolf

LiDAR Center Moves to Houston
National Center for Airborne Laser Mapping relocates — By Thomas Shea

Two- and Four-year Surveying Programs 40
The complex relationships in surveying education — By N.W.J. Hazelton

The Super-Information Highway 43
In Virginia, a private–public project evaluates road assets — By Don Talend

Earth Science Has a Field Day
Surveying and mapping explained — By Stephen Letchford
Official magazine of the American Congress on Surveying and Mapping
Around the Nation
Improving the National Spatial Reference System
Spy vs. spy on Facebook
Does “measure, model, and map” apply to oil spills?
Esri’s demographics forecast
Long before GPS, you had to have portolans
Designing Tomorrow: America’s World’s Fairs of the 1930s exhibition
The Fed Page
Tracking the Gulf oil spill
Modernizing NSRS

The Tech Page
Georeferencing with an interactive pen display
How can it be that we’re only scratching the surface of LiDAR’s potential?
Ask Dr. Map! Solar times, migration during a recession, and much more
ALTA standards: New version
The FIG Report: The XXIV FIG Congress
Oil spill reveals the dangers of success
Designing geodatabases for transportation
Front Cover (bottom): Mashup of Hurricane Katrina in the Gulf of Mexico, August-September 2005. [© Google. Image: © TerraMetrics, © INEGI. Data: SIO, NOAA, U.S. Navy, NGA, GEBCO.]
GR-3 The professional GNSS receiver.
Unmatched performance & versatility. From the time it was introduced three years ago, no other receiver has matched the GR-3. Only surveyors can fully appreciate its range of features and the productivity they create.
- Premium GNSS options—all satellites, all signals
- Rugged design to take a 2-meter drop
- Triple operation—base, RTK rover, or network rover
- Triple communications—cellular, UHF radio, spread spectrum
- Hot-swappable batteries for continuous use
- SIM card slot
- Bluetooth
Powerful features for every job. Priced for every surveyor.
It’s time. topconpositioning.com/gr3
Memorial Day—Presaging summer and sweltering in ninety-degree humidity, the 2010 Memorial Day in the Nation’s capital honored all those who sacrificed their lives in military duty with a concert on the Mall featuring Gary Sinise and Joe Mantegna, the annual “Pouring of the Waters” ceremony at the Navy Memorial, a parade in which all the services participated, and the Rolling Thunder ritual.

The predecessor of Memorial Day was “Decoration Day,” which was observed near the end of the American Civil War to honor Union soldiers who fought in the War. After the War, many communities set aside a day to commemorate their dead. One of the first communities to do so was the town of Sharpsburg near Antietam Battlefield in Maryland. In the South, organized women’s groups were decorating graves before the end of the Civil War, as evidenced by a hymn published in 1867, “Kneel Where Our Loves Are Sleeping” by Nella L. Sweet. The community observances across the U.S. eventually coalesced around an observance which was extended after World War I to honor all those who made the highest sacrifice so that the rest of us can live in peace and abundance.

Two quotes capture the meaning of this important observance: “Freedom of speech and freedom of action are meaningless without freedom to think. And there is no freedom of thought without doubt.” — Bergen Evans. “The purpose of all war is ultimately peace.” — Saint Augustine.
National Flag Day—
Established by a proclamation issued by President Harry Truman in 1949, National Flag Day, observed on June 14, commemorates the authorization of the “Stars and Stripes” as the official national symbol of the United States of America by Congress on June 14, 1777.
The Star Spangled Banner—When the smoke
cleared over Fort McHenry, Francis Scott Key, a lawyer from Georgetown involved in negotiating the release of prisoners held by the British, was so moved to see an American flag still waving over the fort that he resolved to write a poem about it. “The Defence of Fort McHenry,” which he published in the Patriot on September 20, 1814, describes the soaring pride he felt for America and her defenders. Set to the tune of composer John Stafford Smith’s “To Anacreon in Heaven,” the poem became better known as “The Star-Spangled Banner.” Under this name, the song was adopted as the American national anthem, first by an Executive Order from President Woodrow Wilson in 1916 and then by a Congressional resolution in 1931, signed by President Herbert Hoover.
—FROM THE EDITOR

When people think of geography, they don’t often think of crisis management. Yet the study of location applies to all sorts of issues that affect people and the environment. The two-month-old oil spill in the Gulf of Mexico is an unfortunate reminder of how important such knowledge is, and of something else: some problems elude fast technological fixes. The frustrations we have felt mounting with each unsuccessful attempt to “plug the hole” have been amply documented in the national papers and other media.

In this issue, we focus on technology which works. Map mashups created by crowd sourcing have quietly revolutionized the way we share geographic, and increasingly also socio-economic and political, data collected locally about all sorts of occurrences. This is geospatial technology which works precisely because it combines communication and teamwork with data collection, data visualization, and data analysis.

With mashups, LiDAR, RTK surveying, and an improved National Spatial Reference System, geospatial science and technology have left the laboratory for the real world of public and private enterprise. For individuals, their fast penetration into different walks of life has created diverse sources of employment. Geospatial job opportunities are growing, and projections for the future by the Bureau of Labor Statistics are good—in a generally depressed labor market. Surveying and mapping expertise has been and will continue to be in demand; a myriad of projects large and small could not and will not be achieved without the creative, intelligent, and rigorous input of geodesists, land surveyors, cartographers, and GIS professionals. Our profession is at the heart of local and national economies, exploration, location-based responses to crisis events, and much more.
Let’s continue to inform our contributions to society with new emerging science and technology and, in the process, positively affect the economic reality of our Nation and members of our profession. Ilse Genovese
Crisis Map Mashups in a Participatory Age

Mapping of crisis information is not new; however, neo-geographic tools and practices employed in recent disasters are giving rise to new forms of crisis mapping. Map-based “web mashups” are one of these new forms, enabled by Google Maps and GPS-enabled devices. Map mashups are typically web services which combine or “mash up” multiple sources of data which are then displayed in some geographic form. —by Sophia B. Liu and Anahi Ayala Iacucci

Long before the geospatial community got around to creating “mashups,” the concept of “composing” a new product by mashing together different components was successfully used in other domains, such as music. Map mashups entered the mainstream of cartography after Google introduced its public Google Maps API in 2005. Web 2.0 technology, which enables online social communication, mass collaboration, information sharing, and user-centered design, has made the creation and use of web mashups widespread. In this article, we explain why crisis map mashups are created and how they are often designed.1 As an example of how ubiquitous web mashups have evolved, we present a crisis mashup created in response to the 2010 Chile earthquake using the Ushahidi open source platform.

Why Crisis Mashups?

Many mashup developers recognize that mapping crisis information is often more compelling and perceptible than reading text-based crisis reporting. A map of a crisis provides a clear and understandable spatial context and dynamic clustering of information by geo-location. Typically, map mashups are created to keep track of sensitive, rapidly changing crisis data and to make this information more accessible and usable by formal responders and members of the general public. Some web developers create map mashups because they want to display crisis information on a map in real time, incorporating ephemeral crisis information from social media sites and from the crowd in the impact zone of a crisis. One of the features which make some map mashups so popular is bi-directional communication. The flow of information from the field to the map and back again to the field is powered by web-based crisis mapping tools; only the administrator of the map tool needs an Internet connection. Compared to most professional mapping techniques, map mashups typically do not require extensive training to build. Moreover, they are fast and robust enough to visualize different data types at multiple geographic scales, thus making crisis information accessible to multiple stakeholders in different ways.

Another point to remember is that map mashups stand to greatly benefit from crowd-sourcing technologies. Mashing up data from reliable, verified sources with data coming from unverified, unknown crowds yields highly effective situational awareness visualizations of geo-referenced location information. This is because the “crowd” may provide more accurate data than the experts developing the map.

1 A more in-depth analysis of these emergent neo-geographic practices around map mashups in the crisis context can be found in Liu and Palen (2010) in the Cartography and Geographic Information Science journal.
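In code terms, the “mash up” step is little more than merging independently sourced records into one geographic layer. A minimal sketch in Python (all place names and coordinates here are invented; GeoJSON is one common output format for such layers, though not the only one):

```python
import json

# Two hypothetical feeds, as a mashup might pull them: one from an
# official source, one crowd-sourced. All records are invented.
official = [{"name": "Shelter A", "lat": 29.95, "lon": -90.07}]
crowd = [{"name": "Road blocked", "lat": 29.96, "lon": -90.05}]

def to_feature(rec, layer):
    # GeoJSON points use [longitude, latitude] order.
    return {"type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [rec["lon"], rec["lat"]]},
            "properties": {"name": rec["name"], "layer": layer}}

# The "mashup": both sources merged into one displayable layer,
# with each record tagged by where it came from.
mashup = {"type": "FeatureCollection",
          "features": [to_feature(r, "official") for r in official] +
                      [to_feature(r, "crowd") for r in crowd]}
print(json.dumps(mashup, indent=2))
```

A mapping client such as the ones discussed in this article would then render each feature as a pin, with the `layer` property available for styling verified versus crowd-sourced reports differently.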
Data and Design of Crisis Mashups

Not only do crisis mashups differ in the type of data used to create them; the design used to display crisis information varies as well.

Data: Crisis mashups typically use publicly available scientific data, commercial and licensed data, data unearthed by professional and citizen journalists, and data posted on social media sites. The inclusion of user-generated content in mapping crises and the response to them has been facilitated by mobile information and communication technologies. The text messages people send via their wireless phones and the video, photos, or e-mails shared via Facebook, Twitter, and YouTube travel in near real time, across boundaries, to enhance our understanding of the world around us. Future map mashups will likely become more robust if valuable crisis information is published as open data in standardized formats.

Design: During a crisis, we need both spatial and temporal information to gain situational awareness. Many crisis mashups are designed to communicate such information. However, some design features may represent spatiality and temporality simultaneously, while others focus on just the spatial or just the temporal aspects of a crisis. For example, the 2007 San Diego wildfire2 and the 2009 Los Angeles wildfire3 mashups using Google My Maps require users to manually time-stamp and geo-code the crisis reports.
2 http://bit.ly/2007SanDiegoFireGoogleMyMap. 3 http://bit.ly/2009LAFireGoogleMyMap.
The 2009 Los Angeles wildfire mashup [http://bit.ly/2009LAFireGoogleMyMap].
The Sea Level Rise mashup [http://www.mibazaar.com/nationundersiege/]
Twitter map mashups, such as those created for the swine flu4 and the Iran protests5, use the geo-location information associated with the Twitter user’s profile to automatically map a user’s tweets on the map mashup. Location information is likely to become more accurate as users opt in to location-aware features.

4 http://www.mibazaar.com/swineflu. 5 http://www.mibazaar.com/irantweets.html.
Most mashups map near real-time crisis information. However, some also provide historical crisis data, while others map potential crises.
Iran protests, 2009 [http://www.mibazaar.com/irantweets.html]
For example, the Extreme Ice Survey6 mashup (see pp. 10-11) presents the physical geographic changes of various glaciers by displaying time-lapse videos of glacial retreat on Google Earth over a multi-year period. In contrast, the Sea Level Rise on Coastal Cities in the U.S. mashup7 annotates a Google Map with projections of sea-level rise images from a coastal impact study report by a non-partisan, non-profit climate change study group. Given this ability of mashups to represent potential futures, it is worth considering how map mashups can be used to engender a more long-term perspective on crisis response.

Ushahidi-Chile

When the 8.8 magnitude earthquake struck Chile on February 27, 2010,
Anahi Ayala Iacucci, along with others, launched the Ushahidi-Chile Situation Room Project at Columbia University’s School of International and Public Affairs (SIPA). Students from SIPA’s New Media Task Force partnered in the project with a group of students who had been researching the Ushahidi8 platform (http://www.ushahidi.com) for a separate international development project. The Ushahidi-Chile mashup was set up by Patrick Meier, the Director of Crisis Mapping and Strategic Partnerships at Ushahidi, a couple of hours after the earthquake struck and was initially managed from the Situation Room at Tufts University. Within 48 hours, however, Ushahidi-Chile9 became the responsibility of the SIPA students. Assisted by more than 60 SIPA student volunteers trained to monitor social and traditional media reports from Chile, Anahi started working on the mashup on March 1, 2010.

The students manually mapped over 100 incidents—all during midterms week. Within two days, Ushahidi-Chile had over 150 volunteers, and in three days, they mapped 700 reports. To date, more than 1,200 Ushahidi-Chile reports have been mapped.

Several factors contributed to the success of Ushahidi-Chile. The media monitoring and crisis mapping activities on Ushahidi-Chile were modeled on those used for the Ushahidi-Haiti instance (http://haiti.ushahidi.com/) developed barely a month before. A trusted volunteer network was mobilized using pre-existing social ties among students. Free cloud computing tools (i.e., Google Groups, Google Docs and Forms, and Skype Public Chat) were used along with Facebook Groups and Events to facilitate the distributed coordination efforts for Ushahidi-Chile.

Multiple Google Groups were set up to broadcast instructions to the volunteers and coordinate mapping activities. Also, multiple Google Docs and Spreadsheets were created to capture and map geographically distributed information gleaned from media monitoring. The Media Monitoring List for Ushahidi-Chile contained over 250 links to official governmental Twitter feeds from Chile, Twitter users, Twitter lists, Spanish-speaking Twitter experts, Spanish and English news sites, blogs, Facebook groups, and other relevant links that provided live updates on the Chile earthquake.

Google Docs and Spreadsheets became an invaluable collaboration tool, facilitating ad-hoc data updating in real time. Skype Public Chats made possible instant communication among the distributed Ushahidi volunteer network. Through Skype, the volunteers also had access to a transcribed digital trace of their previous communications.

Facebook Groups and Events were used to mobilize volunteers and inform them of crisis mapping training and other related activities. Later, a wiki was created to provide real-time updates on new projects and lessons learned, thus serving as a more complete instruction guide for old and new volunteers. This wiki was also used to aggregate useful information about managing the platform and organizing the workflow around an Ushahidi instance.

The “mash-mapping” of the Chile earthquake was a two-step process. The first step involved monitoring the media for relevant crisis reports from the crowd. Then the volunteer students identified the GPS coordinates for each report and geo-tagged the
6 http://www.extremeicesurvey.org. 7 http://www.mibazaar.com/nationundersiege/. 8 In the next issue of the ACSM Bulletin, we will take a closer look at this platform as a crisis mapping management tool. 9 http://chile.ushahidi.com/.
reports visually on the Ushahidi map. The result was a comprehensive and up-to-date crisis map mashup of the 2010 Chile earthquake which was made available to responders on the ground. The Ushahidi-Chile mashup has been found effective in rescuing earthquake victims, identifying the nearest hospitals, and delivering aid supplies. The mashup has increased understanding of the impact of this and other earthquakes on specific factors in our society. Equally important, it offered volunteer mappers a tangible way of contributing to the disaster relief efforts by helping to manage crisis information.

Lessons from Ushahidi-Chile

The wide participation in mapping the Chile earthquake and the rescue and relief response to it was made possible by the ability of the Ushahidi tools to collect information from SMS text messages via mobile phones and so provide people with a direct and accessible communication channel outside the mainstream media and government agencies. Future crisis mashups should also consider integrating social networking tools, cloud computing services such as Google Groups/Docs/Forms, and mobile technologies to harness the power of the crowd for timely and effective crisis response. The crisis mapping effort around Ushahidi-Chile is, in some ways, a step towards the semantic web. The volunteers manually extracted meaning and value from online content, digital sources, and SMS and then re-structured that information by giving it a title, a description, and a geo-location. This sometimes
meant interpreting the message and the context in which it was created so as to provide a semantic and contextual point of view. Ushahidi’s Meier envisions developing a Mechanical Turk Service plug-in (similar to Swift River10) to facilitate what he calls “turk-sourcing.” This type of sourcing would disaggregate the time-consuming tasks of media monitoring and geo-location determination into human intelligence tasks. Our review of a myriad of crisis map mashups, and in particular the in-depth evolution of the Ushahidi-Chile map mashup, suggests that computing and communication technology have come together to create a powerful means for managing crises—the map built from data mashed together from widely dispersed sources.
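The two-step workflow described above (surface a raw message through media monitoring, then attach a title, a description, and a geo-location) can be sketched in a few lines. This is an illustrative schema only, not the Ushahidi data model; the tiny gazetteer and its approximate coordinates are hypothetical stand-ins for the volunteers’ manual GPS lookup:

```python
from dataclasses import dataclass

@dataclass
class CrisisReport:
    """Hypothetical record mirroring the structure volunteers gave
    each message: a title, a description, and a geo-location."""
    title: str
    description: str
    lat: float
    lon: float
    source: str

# Hypothetical gazetteer standing in for the manual coordinate lookup.
GAZETTEER = {"Concepción": (-36.8270, -73.0503),
             "Talca": (-35.4264, -71.6554)}

def structure_report(raw_text, place, source):
    # Step 1 (media monitoring) has already surfaced raw_text;
    # step 2 attaches coordinates so the report can be mapped.
    lat, lon = GAZETTEER[place]
    return CrisisReport(title=raw_text[:40], description=raw_text,
                        lat=lat, lon=lon, source=source)

r = structure_report("Hospital in Concepción needs supplies",
                     "Concepción", "twitter")
print(r.title, r.lat, r.lon)
```

The hard part the volunteers did by hand, and that “turk-sourcing” would farm out, is exactly the two arguments this sketch takes for granted: deciding which messages are relevant and which place they refer to.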
Reference
Liu, S. B., and L. Palen. 2010. The New Cartographers: Crisis Map Mashups and the Emergence of Neogeographic Practice. Cartography and Geographic Information Science, special issue “New Directions in Hazards and Disaster Research,” 37(1): 69-90.

Authors
Sophia B. Liu (Sophia.Liu@colorado.edu) is a PhD candidate at the University of Colorado at Boulder in an interdisciplinary program called Technology, Media and Society at the Alliance for Technology, Learning and Society Institute. Anahi Ayala Iacucci (anahi@crisismappers.net) received her Master of International Affairs in Human Rights and Humanitarian Affairs at Columbia University. She coordinates Ushahidi-Chile at SIPA and served as a consultant for the UN Iraq Inter-Agency Information and Analysis Unit on the use of the Ushahidi platform in Iraq.
Georeferencing with an interactive pen display —by Mike Dana, Business Development Manager, Wacom
Every year at harvest time, a plane with a multi-spectral camera flies over the vineyards in Washington and Oregon, snapping images of 3,500 acres of grapes in Ste. Michelle Wine Estates. This aerial imagery helps winemakers to analyze grape canopy density prior to harvesting, because, ultimately, the quality of the wine they make depends on it.

The aerial imagery is delivered in hundreds of 20-acre blocks and georeferenced in ESRI’s ArcView software, using Wacom’s interactive pen display. The entire process—from imaging to georeferencing, mapping, and updating geodatabases—must be completed in just one month if the conditions of the vineyard are to be known in real time. Given the short time frame, a rapid and accurate workflow is imperative.

A Wacom interactive pen display combined with ESRI software has proved to be useful to Jennifer Smithyman, the winery’s Precision Ag Specialist. The Wacom pen display enhances many of the features included in the ArcView software suite. Smithyman moves quickly through editing sessions to select coordinates, snap vertices, or place points, drawing directly on Wacom’s LCD display screen. The interactive pen provides a natural feel and pixel-perfect accuracy as images are snapped into place. Smithyman has had the pen for two years now; before, she used a trackball, which was not only tedious but lacked precision.

“I can complete a typical georeferencing project much faster with the Wacom pen and monitor than I would with the trackball,” said Smithyman. “To georeference these images, I rubber-sheet them using ortho and our vineyard outline data. I need 14 points to get each image georeferenced, and with the Wacom pen display in my workflow, I can get a lot more accuracy. These performance benefits are invaluable, especially right before harvest, our massive crunch time.”
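Georeferencing with control points, as in the rubber-sheeting step described above, amounts to fitting a transformation from pixel coordinates to map coordinates. A simpler relative of rubber-sheeting, an affine fit by least squares, shows the underlying idea; the sketch below uses invented control points and is not drawn from the ArcView workflow:

```python
import numpy as np

# Control points: (column, row) in the image vs. (easting, northing)
# on the ground. Real fits, like the 14-point ones described above,
# use more points than unknowns so measurement errors average out.
pixel = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 50]])
ground = np.array([[500000.0, 5180000.0],
                   [500020.0, 5180001.0],
                   [500001.0, 5179980.0],
                   [500021.0, 5179981.0],
                   [500010.5, 5179990.5]])

# Solve ground ~= [col, row, 1] @ params for the 6 affine parameters
# (scale, rotation/shear, and translation) in a least-squares sense.
design = np.hstack([pixel, np.ones((len(pixel), 1))])
params, *_ = np.linalg.lstsq(design, ground, rcond=None)

def to_ground(col, row):
    """Map a pixel position into ground coordinates."""
    return np.array([col, row, 1.0]) @ params

print(to_ground(50, 50))
```

True rubber-sheeting goes further, warping the image locally so every control point fits exactly, but the workflow is the same: pick matching points on the image and the reference data, then let the software solve for the transformation.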
The Wacom pen display’s precision is crucial for identifying differences in canopy density revealed when the aerial imagery is converted to the NDVI format (Normalized Difference Vegetation Index). NDVI displays the photosynthetic output of plants, based on their spectral bands in aerial imagery.

“We’ve found a correlation between the size of the canopies and the grape flavors,” Smithyman says. “Winemakers have different preferences for grape flavor; some like grapes with more vegetal characteristics, others prefer more fruit flavors.” The Wacom pen pinpoints plants with different fruit flavor characteristics. So accurate is the georeferencing that the vineyard can be divided into different zones of grape flavor, which can then be picked separately. Smithyman (opposite) uses an interactive pen display to georeference digital aerial imagery and mark the flavor zones with remarkable precision. With the Wacom technology, she can create original content, edit maps, and analyze geographic information easily and efficiently. And she can do it while managing data on any of her multiple monitors.

In addition to greater speed and precision, the Wacom interactive pen display provides ergonomic benefits. “My fingers don’t hurt anymore,” Smithyman says. “Originally, I had a mouse and was having some shoulder problems. So I switched to a trackball, but since I’m constantly moving, my fingers were sore. When I use the Wacom pen display, I don’t have that problem.”

With the Wacom pen technology, the use of georeferenced data by Ste. Michelle Wine Estates has increased steadily. The fact that field crews now use Trimble units to gather information year round to track frost problems, plan new vineyard sites, and identify pest infestations has helped as well. “The capabilities of the Wacom pen display to georeference aerial imagery have impressed everyone here,” Smithyman concludes. “My overall workload has grown.
At harvest time in particular there’s no way I could get it all done so quickly using a standard monitor and a trackball. The Wacom pen display is a real lifesaver!” june 2010 ACSM BULLETIN 15
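The NDVI conversion mentioned in the article is a simple per-pixel band ratio, NDVI = (NIR - Red) / (NIR + Red), which ranges from -1 to +1 and rises with photosynthetic activity. A minimal sketch in Python; the reflectance values below are invented for illustration and are not Ste. Michelle’s actual data:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for one pixel.

    nir, red: surface reflectance in the near-infrared and red bands (0-1).
    Dense, healthy canopy reflects strongly in NIR and absorbs red light,
    so vigorous vegetation pushes NDVI toward +1.
    """
    return (nir - red) / (nir + red)

# Hypothetical pixels: a dense vine canopy vs. bare soil between rows.
canopy = ndvi(0.50, 0.08)  # high NDVI: vigorous canopy
soil = ndvi(0.30, 0.25)    # low NDVI: little vegetation
```

Thresholding a grid of such values is one way flavor zones like those Smithyman delineates could be segmented automatically, though the zoning described in the article is done interactively.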
Where are the jobs? —by Ilse Genovese
Two-and-a-half years after the beginning of the current recession, job creation remains a national priority. But to replace the more than seven million jobs lost in the U.S., employers will need to add well over 300,000 jobs a month for four years in a row. There are very few periods in U.S. history when job growth has been that strong, said Harvard University labor economist Lawrence Katz. This notwithstanding, the U.S. economy is expected to create 15 million jobs by the end of this decade. Where will those jobs be? And what kind of jobs will they be?

bureau of labor statistics projections
In their report “The U.S. economy to 2018: From recession to recovery,” Ian Wyatt and Kathryn Byun, economists in BLS’ Division of Industry Employment Projections, estimate that real GDP growth will average 2.4 percent annually over the next decade. This is close to the 2.5 percent trend observed between 1998 and 2008. However, productivity growth is expected to slow by 2018, and people are likely to save more while spending less on personal items. The BLS projections show a different workforce. More people will have to work longer before retiring, and the pool of gainfully employed people will be ethnically more
diverse. One-third of total job openings will be filled by job seekers with a post-secondary degree. According to BLS, the U.S. economy will continue to be a predominantly services-based economy. “Changes in shares of employment over the decade will result from continuing increases in service-providing sectors, while goods-producing sectors (except agriculture) will lose employment,” noted Kristina Bartsch, chief of BLS’ Occupational Outlook Division, in the November 2009 issue of the Monthly Labor Review. Among the seventeen major industry sectors in the U.S., health care, financial services, and information are expected to expand. This outlook conforms with the prediction that the highest employment growth will occur in management, scientific, and technical consulting; computer systems design; and employment services. The major goods-producing sectors—mining, construction, and manufacturing, for instance—are, however, not likely to realize substantial employment growth despite some growth in output. With more than half of the new jobs in professional service occupations, economists like Katz worry about the “polarization of the labor market”—strong job growth for high- and low-paying jobs but less growth in the middle of the labor market to replace the well-paying manufacturing jobs the U.S. has been losing.

outlook for geospatial professions
Bureau of Labor Statistics’ Occupational Outlook Handbook, 2010-11 predicts “favorable job prospects” for surveyors, cartographers, and photogrammetrists with a bachelor’s degree and strong technical skills. Employment of technicians and professionals in these fields is expected to grow by 19 percent from 2008 to 2018, which is “faster than the average” for all occupations. Increasing demand for fast, accurate, and complete geographic information is expected to fuel this job growth. A 19-percent change over the next decade translates to 174,500 employment opportunities for surveying and mapping professionals and technicians. Of this total, 81,800 will be positions held by professional surveyors, cartographers, and photogrammetrists, according to projection data in the National Employment Matrix. Employment for cartographers and photogrammetrists will change by 27 percent over the next decade, accounting for 3,300 new positions. The number of professional surveyors is expected to rise from 57,600 in 2008 to 66,200, a 15-percent change. Employment for surveying and mapping technicians will grow by
20 percent (or 15,700 jobs) to a total of 92,700 by 2018. Professionals and technicians involved in the development and application of GIS are expected to benefit most from new employment opportunities. Opportunities for traditional surveying services, “tied strongly to construction activity,” may vary depending on local economic conditions. However, because surveyors can work on many different types of projects, they are expected to have “steadier work than other workers when construction slows.”
The occupational projections by BLS for surveyors can be compared with data and trends presented in POB’s 2010 Salary and Benefits Study (http://www.pobonline.com/Articles/Article_Rotation/BNP_GUID_9-5-2006_A_10000000000000814058). The study was conducted by BNP Media’s Market Research Department and is based on 553 returns to a questionnaire sent to 5,448 POB subscribers. The year 2009, which was the focus of the study, was a challenging year for many surveyors. Companies reduced spending, cut salaries, and used furloughs to stay afloat. Those that adapted and used new technology to deliver surveying and mapping services
in new markets have coped better with adverse economic conditions. A comparison by sector indicates that job opportunities for surveyors in the public sector were up by nine percent from the 2008 level, while employment in the private sector fell by seven percent during the same period. Demand for engineering design and road/infrastructure/transportation surveys increased by six percent each, but surveys conducted for real estate sales and construction projects experienced a slow-down. Small surveying companies (10-24 employees) continued to struggle in 2009, which is reflected in the 67 percent decrease in full-time employment reported for these companies. Some respondents (31 percent) reported no changes in full-time employment at their companies, while others (8 percent) noted a slight increase. The biggest gains (24 percent) in full-time employment were realized by larger companies with between 250 and 499 employees. The POB 2010 study confirmed a steady trend toward higher education and increasing licensure. As many as 39 percent of respondents reported having a bachelor’s degree—an increase of five percent compared with 2008. The number of licensed professionals increased as well. In 2008, 70 percent of respondents were licensed surveyors;
COUNTY SURVEYOR Multnomah County’s Department of Community Services is seeking applicants for the County Surveyor. The County Surveyor plans, prioritizes, assigns, supervises, reviews, and approves the work of staff involved in the surveying of county roads and property and public land survey corners. This position is responsible for the review and approval of land divisions (subdivision, condominium, and partition plats); maintaining public survey records; determining topographic features for county road improvement projects, road right-of-way lines, and county property boundaries; reviewing and analyzing evidence to determine the location of public land survey corners; and giving advice and direction to property owners experiencing boundary disputes or issues. The successful candidate must exercise sound judgment and demonstrate thorough knowledge of practical and legal principles of boundary, cadastral, construction, county road, and geodetic surveying; federal and state laws governing the practice of boundary surveying and the establishment of public land corners; the review and approval of subdivisions, partitions, and condominiums; principles, methods, and techniques of public administration; personnel management techniques; public sector budgeting procedures; and archival principles of permanent public records. Salary is $66,604.55-$93,244.81 annually. This recruitment is open until filled; first screening date: June 4, 2010. Apply at www.multcojobs.org; see job #9649-01. EOE
this number rose to 74 percent in 2009, with most (86 percent) being licensed as Registered Professional Land Surveyor (RPLS), Professional Land Surveyor (PLS), Registered Land Surveyor (RLS), and Land Surveyor (LS). The number of Land Surveyors-in-Training (LSIT) and Surveyors-in-Training (SIT) increased by four percent, and there were incremental increases in GISP and Certified Photogrammetrist certifications. The average gross salary fell seven percent, to $66,009, in 2009. Slow or no work was the most often cited reason for salary declines. The salary levels reported for new hires differed across sectors and companies, but title, licensure, experience, and region all had an effect. The technology with the most impact on the surveyors’ role in the economy continues to be GPS. Other technologies which can make surveying businesses more competitive are laser scanning, LiDAR, GIS, total stations, RTK GPS, and 3D scanning. Not surprisingly, when asked how surveyors can be successful in coming years, most said that they should keep up with new technology. However, many respondents felt that higher education and more professional training will be as important as technology in increasing their companies’ market share in new professional service industries.

gis and giscience

Thirty years ago, geographic information systems were only beginning to be deployed in government agencies, the military services, police departments, private firms, and in higher education. Today, as knowledge of place becomes ever more vital to a vast range of human activity, GIS and GIScience (see sidebar on this page) are central to the ways thousands of government agencies, private companies, and not-for-profit organizations conduct business (Beyond Mapping: Meeting National Needs Through Enhanced Geographic Information Science, 2006). Besides GIS, the global positioning system (GPS), remote sensing, and other information technologies have contributed to the changing nature of work in the mapping sciences and in the professions, industries, and institutions which use them for information management and delivery.

GIS has been defined as “an organized collection of computer hardware, software, geographic data, and personnel designed to efficiently capture, store, update, manipulate, analyze, and display all forms of geographically referenced information” [U.S. Fish and Wildlife Service, 2006], and as “a computer-based tool for mapping and analyzing feature events on Earth. GIS technology integrates common database operations, such as query and statistical analysis, with maps” [ESRI, 2006]. GIScience has been defined by Goodchild (1992), who coined the term, as “a multidisciplinary research enterprise that addresses the nature of geographic information and the application of geospatial technologies to basic scientific questions.” The definition adopted in the 2006 GIS/GIScience Body of Knowledge (DiBiase et al., 2006) is “the science behind or underlying geographic information systems technologies and their applications.”

Yet, despite the deep penetration of the mapping sciences and technology into many realms of daily life, “the supply of well trained and well educated GIS/GIScience professionals has not kept pace with the demand for more robust geographic data in the U.S.” (Mondello et al. 2004). An information-based economy where information about place is deemed fundamental to conducting business necessitates a rapid growth in the GIS/GIScience labor force. It should therefore not come as a surprise that the U.S. Department of Labor (n.d.) identified geospatial occupations as one of the 12 high-growth employment sectors for the 2000-2010 period. An article in Nature by Gewin (2004) reported that “geotechnology” has become “one of the three most important emerging and evolving fields in the U.S. economy, along with nanotechnology and biotechnology.” Job opportunities in the field were expected to grow and diversify as “geospatial” technologies proved their value in ever more areas. The Labor Department’s projection of up to 29 percent growth by 2010 supports this claim. No hard data are currently available on whether this trend will continue into the next decade. Being considered a high-growth job-creation sector is not enough; we need to know where the jobs are now and where they are likely to be as the geospatial field evolves and uses of its products diversify. A useful starting point is the Mondello et al. (2004) study, which reported that 175,000 workers were employed in the domestic remote sensing and geospatial information industries in 2004. Four years earlier, ESRI (Environmental Systems Research Institute) noted that “some 500,000 individuals in the U.S.
used GIS software at work, and that 50,000 were full-time GIS specialists” (Phoenix, 2000).
gis competence levels
[in ascending order]
1. Public awareness of GIS and its uses
2. Basic spatial and computer understanding
3. Routine use of basic GIS software
4. Higher-level modeling applications of GIS
5. Design and development of GIS applications
6. Design of geographic information systems
7. GIS research and development
[Source: DiBiase et al., 2006; Marble, 1997]
Ten years later, the employment picture of the geospatial industry remains patchy. Part of the uncertainty is due to the rapid evolution of GIS science and technology. There has also been some discussion as to which occupations should be counted as “geospatial occupations.” The Department of Labor’s 12 major geospatial-related employment categories include such traditional occupations as cartographers and photogrammetrists, surveyors, and surveying and mapping technicians. Listed in addition are architectural and civil drafters, civil engineering technicians, mechanical and electrical drafters, electrical and electronic engineers, mechanical and industrial engineering technicians, environmental engineering technicians, and geoscientists. In terms of employment growth, environmental engineering grew fastest in the past decade (by 29 percent), followed by surveying and mapping (25 percent), and electrical drafting (23 percent). GIS professionals, according to a guide to college majors in GIS, often find work with federal agencies like the U.S. Geological Survey, Bureau of Land Management, Army Corps of Engineers, Forest Service, National Oceanic and Atmospheric Administration, National Imagery and Mapping Agency, and Federal Emergency Management Agency. However, the vast majority of available jobs are with engineering, architectural, and technology firms.

selected employment opportunities for gis professionals
[in alphabetical order]
1. Application development
2. Data acquisition, analysis, and interpretation
3. Data management
4. Interorganizational facilitation and communication [coordination]
5. Marketing
6. Project management
7. Systems analysis
8. Training
9. Visualization
[Source: Gaudet et al. 2003, p. 25]
Advanced technologies continue to increase the productivity of GIS workers, and GIS accuracy should increase with the use of surveying expertise and devices. Because of greater productivity, most of the job openings will occur when workers transfer to other occupations or leave the labor force altogether. However, surveying and mapping technician jobs are expected to grow faster than average, precisely because of the greater emphasis on accuracy in GIS. The decline in digital technology costs is expected to benefit the employment outlook for geospatial professionals. This service, which was once limited to major companies and federal agencies, has expanded considerably. Now, small companies and government agencies can afford to purchase their own GIS programs and, frequently, bring a GIS professional on board.

Notwithstanding these encouraging trends, the GIS/GIScience labor force may still not be large enough to provide real-time mapping capabilities for informed decision-making about such issues as smart growth, environmental preservation, and adequate water and sewage systems. Information displayed on GIS maps is part of responses to emergency 911 calls, weather forecasting, air traffic control, crop monitoring, search and rescue, disaster response, anthropology, forestry, genome mapping, and national security. And while the range of disciplines using map products as tools has diversified beyond those investigating geographic phenomena over the Earth’s surface, the sources of geospatial information have diversified as well [see sidebar about “logic and convenience” with respect to data sources].

Mobile devices with increasingly more powerful computing capabilities have encouraged “citizen science,” in which individuals become “sensors” recording location-aware changes in their environment (Ball, 2010) and contributing that information to national and international mapping efforts. Mainstream geospatial domains where location is used to solve business problems and to increase our understanding of the built and natural world have accelerated demand for geographic information by the commercial, government, and private sectors. Thus, going forward, the challenge will be to re-examine and refine the levels of GIS/GIScience competence [see sidebars], determine their share in current employment figures, and identify their roles in future workforce development.

Logic vs Convenience
While increasingly more powerful data gathering and computing capabilities provide new and exciting opportunities for examining geospatial information from a wide array of perspectives, we must understand that the appropriate use of that information is critical. This is especially true as we develop cadastral layers for land parcels. Ensuring that the convenience provided by technology and the relatively easy use of positioning tools does not undermine logic in the way we define and locate land boundaries is a critical element of protecting the integrity of our nation’s traditional principles of land tenure and ownership rights. — Curt Sumner, ACSM Executive Director

references

Bartsch, K.J. 2009. Employment outlook: 2008-2018—The employment projections for 2008-18. Monthly Labor Review, November 2009.
Ball, M. 2010. What are the implications of “Mobile First” for the geospatial industry? V1 Newsletter, June 4, 2010.
Bureau of Labor Statistics. n.d. Occupational outlook handbook, 2010-11 ed. Surveyors, Cartographers, Photogrammetrists, and Surveying and Mapping Technicians. [http://www.bls.gov/oco/ocos040.htm; June 2010].
Committee on Beyond Mapping: The Challenges of New Technologies in the Geographic Information Sciences, The Mapping Science Committee, National Research Council. 2006. Beyond mapping: Meeting national needs through enhanced geographic information science. National Academy of Sciences, Washington, D.C.
DiBiase, D., M. DeMers, A. Johnson, K. Kemp, A. Luch, B. Plewe, and E. Wentz, eds. 2006. Geographic information science and technology body of knowledge, 1st ed. UCGIS Education Committee. Association of American Geographers, Washington, D.C.
FGDC (Federal Geographic Data Committee). 2005. Status of FGDC standards.

Jobs, p. 22, 2nd col.
ARE YOU A CST?
INCREASE YOUR OPPORTUNITIES BY BECOMING
A CERTIFIED SURVEY TECHNICIAN
click on CERTIFIED SURVEY TECHNICIAN tab
NSPS CERTIFIED SURVEY TECHNICIAN (CST) PROGRAM Phone: 240.632.9716 ext. 113 E-mail: <firstname.lastname@example.org>
NSPS, 6 Montgomery Village Avenue, Suite 403, Gaithersburg, MD 20879
Jobs, from p. 20
CHIEF OF SURVEYS (CENTRAL VALLEY) Bachelor Degree in Engineering, Surveying or related field 5 years experience Send resume to: KC Engineering and Land Surveying PC Attn: President 370 7th Avenue New York, NY 10001.
Bachelor degree in engineering, surveying or related field 5 years experience
Send resume to: Montrose Surveying Co. LLP Attn: Saeid Jalilvand 116-20 Metropolitan Avenue Richmond Hill, NY 11418
[http://www.fgdc.gov/standards/status/textstatus.html; Jan. 2006].
Guide to College Majors in Geographic Systems. n.d. Accessed at WorldWideLearn, an online directory of education.
Gaudet, G.H., H.M. Annulis, and J. Carr. 2003. Building the geospatial workforce. URISA Journal 15(1): 21-30.
Gewin, V. 2004. Mapping opportunities. Nature 427(6972): 376-77.
Goodchild, M.F. 1992. Geographical information science. International Journal of Geographical Information Systems 6(1): 31-45.
Marble, D.F. 1997. Rebuilding the top of the pyramid: Structuring GIS education to effectively support GIS development and geographic research. Keynote address, GIS in Higher Education Workshop, Washington, D.C., October 1997. [http://www.fes.uwaterloo.ca/crs/gp555/marble.pdf; Dec. 2005].
Mondello, C., G.F. Hepner, and R.A. Williamson. 2004. Ten-year industry forecast, phases I-III, study documentation. Photogrammetric Engineering and Remote Sensing 70(1): 5-58.
Phoenix, M. 2000. Geography and the demand for GIS education. Association of American Geographers Newsletter 35(6): 13.
Point of Beginning. May 2010. Highlights from POB’s Annual Salary & Benefits Study. [http://www.pobonline.com/Articles/].
U.S. Department of Labor. n.d. Geospatial—high-growth industry profile. [http://www.learningconcepts.net/images/Profile-geoindustry.pdf; Jan. 2006].
Wyatt, I.D., and K.J. Byun. 2009. Employment outlook: 2008-2018—The U.S. economy to 2018: From recession to recovery. Monthly Labor Review, November 2009.
Rand McNally’s IntelliRoute TND 700, a new truck GPS device with a 7-inch state-of-the-art high-definition screen and enhanced software features
Sokkia’s Series 50X Total Stations feature increased range, speed and expanded functionality
NOAA rolls out
Plans for a modernized National Spatial Reference System
—by Renee Shields, Height Modernization Manager, and Dru Smith, NGS Chief Geodesist
In 2008, the National Oceanic and Atmospheric Administration’s (NOAA) National Geodetic Survey (NGS) released a ten-year plan announcing the intention to improve and modernize the National Spatial Reference System (NSRS)—the official U.S. government source for determining precise latitude, longitude, and elevation. The proposed improvements to the reference system will take greater advantage of newer Global Positioning System (GPS), mapping, and charting technologies. Changes to the system will impact civilian and federal mapping authorities, as well as state and municipal governments that have adopted the NSRS. NGS held a Federal Geospatial Summit at NOAA headquarters in Silver Spring, Maryland, on May 11-12, marking the beginning of a dialogue with users on the necessary changes to infrastructure and operating methodologies. The May meeting was the first of several such meetings NGS plans to conduct over the next 8 to 10 years with NSRS users responsible for a variety of federal surveying and mapping activities. Representatives from 22 federal, 10 state, and 3 international agencies, along with numerous private sector contractors and academic institutions, attended the Summit in person or via a webinar. The Summit opened with a keynote address by NOAA Chief Information Officer, Joseph Klimavicz, who spoke of NGS’ 200-year history, its beginnings as the U.S. Coast and Geodetic Survey, and the development of the infrastructure which became the NSRS. Klimavicz explained how “the GPS technology has transformed the way we do business,” adding that NOAA was holding the Summit “to share information as a community and to ensure that [NOAA’s] modernization of the NSRS is performed
in coordination with … federal partners in the geospatial community [as well as] partners from states, municipalities, industry, and academia.” Following the keynote, NGS Director Juliana Blackwell welcomed the participants and set the tone for two days of discussion on how the plan to modernize the NSRS would impact the products and services of the mapping programs in other federal agencies. Past NGS Directors RADM (ret.) John Bossler and Dave Zilkoski each gave a presentation providing insight into the issues which were important from the 1970s through the 1990s, when they led projects to redefine the horizontal and vertical datums. The directors’ remarks were followed by presentations focused on the current technical challenges. Dr. Dru Smith, NGS Chief Geodesist, explained why modernizing the NSRS is necessary and who would benefit most. He was followed by Dr. Dan Roman, the lead scientist on the team responsible for developing NGS’ geoid model, and Dr. Richard Snay, recently retired Chief of the NGS Spatial Reference System Division and one of the early leaders in the development of Continuously Operating Reference Stations (CORS). Together, their presentations provided a clear picture of how the redefinition of the two major components of the NSRS, the geopotential and geometric datums, will make it possible to provide latitude, longitude, and ellipsoid and orthometric heights to the geospatial community more efficiently and with greater accuracy. Dave Doyle, NGS Chief Geodetic Surveyor and Master of Ceremonies, kept the event moving smoothly from presentations to moderated panel sessions during which NGS scientists and policy-makers fielded questions from the audience. In addition to the panel sessions, agencies were invited to sign up for “Minute Sessions.” These three-to-five-minute time slots enabled agencies to present their most critical issues, concerns, and comments. The first discussion panel focused on issues surrounding the modernization of the geometric datum—currently the North American Datum of 1983 (NAD 83)—realized as a network of over 1,400 CORS and almost 70,000 passive control marks. The discussion centered on the reasons for changing the datum, whether the datum should be dynamic, and what kinds of shifts to expect across the country. The focus then turned to the geopotential (vertical) datum—currently the North American Vertical Datum of 1988 (NAVD 88)—defined through a national leveling network. The arguments for modernizing the vertical datum were more obvious, and the need for well-monitored heights was apparent from the questions raised by the audience. During the Minute Sessions the next day, the focus was on how the transition would impact the products and services provided by federal agencies. NGS heard support for the modernization of the datums, but also calls for NGS to provide tools and education to facilitate their adoption. Participants reminded NGS of their reliance on passive control monumentation and their need to maintain ties to legacy data, and voiced concerns about the large resources needed to convert all their products to the new datums. Agencies and vendors offered to collaborate with NGS in building the needed tools and encouraged NGS to hold follow-up summits to prepare stakeholders for the change. A website will be created to provide information to stakeholders as NGS moves forward with its ten-year plan. The outcomes of the May Summit can be found at http://www.ngs.noaa.gov/2010Summit/. A webcast and transcripts will soon be made available, and proceedings are expected to be published by the end of the summer.
Improving the National Spatial Reference System —From a White Paper by Dr. Dru Smith
The future of positioning is GNSS.[1] The underlying reference frames for all GNSS systems are geocentric. The International Terrestrial Reference Frame (ITRF), used for globally consistent scientific applications such as the determination of sea level change, has become progressively more geocentric over the last ten years, so that now the origin of the ITRF coincides with Earth’s center to about 1 centimeter of accuracy. Furthermore, countries are increasingly choosing GNSS as their primary tool to access a vertical datum, minimizing their reliance on passive control that is not actively monitored. In the United States, the official geometric (historically called “horizontal”) datum, NAD 83,[2] has a known non-geocentricity of over two meters, and the official vertical datum, NAVD 88,[3] is accessed through a set of passive control that is fragile, inaccurate, and rapidly deteriorating. The National Geodetic Survey (NGS) at the National Oceanic and Atmospheric Administration (NOAA) is working to define and adopt a geocentric reference datum for the United States to replace NAD 83. The agency is also working to compute an accurate geoid model which will serve as the defining surface of a new vertical datum, accessed through GNSS technology, which will replace NAVD 88. These two changes are dependent upon one another in a variety of ways and are currently planned to occur simultaneously. The decision to proceed with these changes was both obvious and difficult, as NGS is cognizant of two important but conflicting needs in the user community: accuracy and constancy. To fulfill its mandate to provide the geodetic reference frame for all geospatial activities in the United States, NGS must strive to be as scientifically accurate as possible. After much internal discussion, NGS determined that it must address serious issues of inaccuracy in the current realizations of NAD 83 and NAVD 88.
However, NGS recognizes that significant user resources have been invested in the current realizations of the two datums. In order to continue improving accuracy while minimizing the impact of new reference frame paradigms, NGS is working to implement this transition over the next ten years. This will allow time for the user community to voice concerns, for NGS to address them, and to ensure that the transition will go as smoothly as possible.
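The GNSS-based access to a vertical datum described here rests on the standard relation between the ellipsoid height h measured by GNSS, the geoid undulation N taken from a geoid model such as the one NGS is computing, and the orthometric height H: to first approximation, H = h - N. A toy sketch; the numeric values below are invented for illustration, not NGS data:

```python
def orthometric_height(h_ellipsoid: float, geoid_undulation: float) -> float:
    """Approximate orthometric height H = h - N (all values in meters).

    h_ellipsoid: height above the reference ellipsoid, from GNSS.
    geoid_undulation: geoid height N relative to the ellipsoid, from a model.
    """
    return h_ellipsoid - geoid_undulation

# Hypothetical observation: GNSS reads h = 120.00 m where the geoid model
# gives N = -28.50 m (the geoid lies below the ellipsoid over much of CONUS).
H = orthometric_height(120.00, -28.50)  # 148.50 m above the geoid
```

This relation is why the accuracy of the geoid model directly bounds the accuracy of GNSS-derived heights in the proposed datum.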
Replacing the North American Vertical Datum of 1988 as the official U.S. Vertical Datum

Significant changes to the science and methodology of geodetic leveling occurred during the mid-20th century. A widespread multi-agency effort to collect terrestrial gravity measurements, the development of new corrections to leveling, and a deeper understanding of the differences between local mean sea level (LMSL) at disparate tide gages all called into question the accuracy and reliability of the National Geodetic Vertical Datum of 1929 (NGVD 29). These improvements in scientific knowledge, and the new 625,000 km of leveling (including 81,500 km of first-order re-leveling) performed after NGVD 29, were used to create the North American Vertical Datum of 1988 (NAVD 88). NAVD 88 was a major improvement over NGVD 29; however, no nationwide effort to re-adjust NAVD 88 has been made since its inception. Some localized leveling has allowed original heights to be superseded, and in some cases (e.g., Louisiana) a number of questionable heights have been removed in favor of updated leveling and GPS-based heights. Without an active maintenance plan, current regional distortions in the network are already impacting its value and effectiveness. Because of known problems in the original realization of NAVD 88, and ongoing problems in the very nature of a passive-mark-based system of vertical geodetic control, NGS proposed in its ten-year plan (NGS, 2008) that “a new geopotential datum… is defined and realized through the combination of GNSS technology and gravity field modeling.” There are six major issues with NAVD 88 which warrant its replacement: 1) cross-country accumulation of errors from geodetic leveling; 2) fragility and location of passive marks; 3) bias in the NAVD 88 H=0 reference surface as compared to the geoid; 4) subsidence, uplift, and other crustal motions; 5) sea level change; and 6) changes to Earth’s gravity field. The entire document is accessible at http://www.ngs.noaa.gov/2010Summit/.
[1] Global Navigation Satellite Systems—all constellations of positioning satellites, including GPS, Galileo (Europe), GLONASS (Russia), and Compass (China).
[2] The North American Datum of 1983.
[3] The North American Vertical Datum of 1988.
Introduction to the Semantic Web—Part 1

This edition of DIY GIScience doesn’t emphasize the DIY part as much. Instead, I’m presenting a primer in two parts on what some are calling Web 3.0: the Semantic Web. Let’s begin with some historical context, and then I will present two of the key concepts that underlie this compelling technology.

In 1989, Sir Tim Berners-Lee had an idea: What if you could make a simple method that would have one document point to another? This wasn’t a new idea. In 1945, Vannevar Bush (no relation to George W. Bush) proposed a system which used indexed microfilm to mimic the “intricate web of trails carried by the cells of the brain.” Bush wanted his Memex system to store all of an individual’s books, records, and communications in a way which would allow for their instant retrieval. In 1965, Ted Nelson coined the term “hypertext” and later founded a project called Xanadu, a database in which all human knowledge could be stored and retrieved. Apple Computer created the HyperCard software for the Macintosh in 1987, which combined linked information with a graphical user interface.

What made Berners-Lee’s idea unique was the network. Instead of building a single piece of software to manipulate a database or files stuck on one machine, the future knight (Berners-Lee was named Knight of the British Empire in 2004) created two mechanisms: one which would deliver files on demand, and another which would display the files and let the user request others. The requests would be made over a network, in particular over the Internet, in the form of a Uniform Resource Locator (URL). At the time, the Internet was an ecosystem of servers and clients with specific, narrow purposes like file transfer (FTP), remote access (telnet), news feeds (NNTP), and games (MUD). The nascent World Wide Web and the proposed hypertext transfer protocol (HTTP) which made it work were just one of hundreds of different means of exchanging information. But this particular method caught on in a big way. So big that today we equate “the Internet” with “the World Wide Web” and ignore the rest of that ecosystem.

The World Wide Web in many ways is Ted Nelson’s Xanadu and Vannevar Bush’s Memex. I can find the answer to the most trivial questions with a quick Bing search. I can listen to my favorite music and even find esoteric recordings with web services like Pandora. My written correspondence lives in my web-based e-mail account (archive, don’t delete!). Even my voicemail has become absorbed into the web via Google Voice.

But this Xanadu isn’t a panacea, and our knight hasn’t slain his last dragon. Sir Tim Berners-Lee has been on a crusade to do for your raw data what he did for your documents. The problem with the World Wide Web, as it exists today, is that all of the information is hidden inside chunks of text or blobs of media. Berners-Lee has been working on what is called the Semantic Web as a means to expose these hidden meanings.
—by Eric Wolf
Like the World Wide Web, the Semantic Web starts off with two basic concepts: all data are broken into "triples," and each triple can be accessed via a Uniform Resource Identifier (URI). A triple consists of a subject, a predicate, and an object. The subject is the thing the triple is about. The object is something that modifies or describes the subject. And the predicate determines how the object modifies or describes the subject. For example, say you have a database table of contact information. You could form triples like this:

George hasPhoneNumber 303-404-5555

"George" is the subject, "hasPhoneNumber" is the predicate, and "303-404-5555" is the object. Each triple of information can be accessed via a URI. URIs happen to look like the URLs from the World Wide Web; in fact, a URL is considered a special case of a URI. So a Semantic Web application could use a URI such as "http://friends.org/contacts/george/12345" to access that value. On the Semantic Web, instead of addressing pages where information is embedded in context, the raw data are exposed.

Perhaps the most powerful part of the Semantic Web is that the objects need not be values. Instead, a triple could look like this:

Fred hasFriend http://friends.org/contacts/george/12345

The object could be a URI connecting two pieces of data. In relational database systems, connections between two kinds of data are mapped through relations. When a relation is established, every similar piece of information (other records in the table) has the same kind of relations. In the Semantic Web, the relations can be more fine-grained. And since the "database" is now a "web," two pieces of information need not be in the same system.

If you use Facebook and LinkedIn, you have a great deal of duplicated information. Your basic contact information, your friends, even your picture may all be duplicated. If you get a new cell phone, you have to change your contact information in both systems separately. If Facebook and LinkedIn used Semantic Web technology, there could be one set of contact information that each site would access.

Have you ever tried to use a geospatial data clearinghouse—e.g., the Geospatial One-Stop (http://gos.geodata.gov)—to find a dataset? Frequently, clearinghouses don't give you raw data. Instead, they may simply index the metadata records. The metadata may include contact information for someone who

Semantic web, p. 28, 2nd col.
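The triple model described above can be sketched in a few lines of Python. This is a toy illustration only—a real Semantic Web stack would use RDF and a query language such as SPARQL—and the store, the example URIs, and the query helper are our own invention:

```python
# A toy triple store: each fact is a (subject, predicate, object) tuple,
# and an object may be a plain value or a URI pointing at another resource.
triples = {
    "http://friends.org/contacts/george/12345": [
        ("George", "hasPhoneNumber", "303-404-5555"),
    ],
    "http://friends.org/contacts/fred/67890": [
        ("Fred", "hasFriend", "http://friends.org/contacts/george/12345"),
    ],
}

def query(uri, predicate):
    """Return the objects of all triples stored at `uri` with the given predicate."""
    return [o for (s, p, o) in triples.get(uri, []) if p == predicate]

# Follow Fred's hasFriend link, then read the phone number at the far end:
friend_uri = query("http://friends.org/contacts/fred/67890", "hasFriend")[0]
print(query(friend_uri, "hasPhoneNumber"))  # ['303-404-5555']
```

Because the object of Fred's triple is itself a URI, the two records need not live in the same system—which is the point of the "web of data."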
Ask Dr. Map!

Dear Dr. Map,

Q: How can I find out solar times, such as sunrise and sunset, for any location I want?

A: Dr. Map recommends the NOAA Solar Calculator, on the web at www.esrl.noaa.gov/gmd/grad/solcalc. Through a Google Maps-like interface, you can find any place on Earth, then solve for solar noon, sunrise, or sunset times. For example, I set the CITY parameter to Washington, D.C., and the date to June 9th, 2010, and got a sunrise of 04:43 and an apparent sunset of 19:32, for a length of day of 14 hours and 49 minutes at 38.88° north. On the solstice (June 21st), the day length is only five minutes longer and the sunrise is at the same time, so we're close to the "longest day." Note that the equation of time is rather tricky; you need to know your time zone, position, and daylight saving time status.

Q: How has the recession changed the population distribution in the U.S.?
A: The recession, which formally started in December 2007, has so far cost the U.S. economy some eight million jobs. Many of these losses, and the numerous home foreclosures, have forced people to move, and these moves are starting to show up on maps. New Census data released in March show a rapid shift in migration in the worst months of the recession, from July 2008 to July 2009, away from the Sunbelt states. Many people have decided to stay put or to move back to where they came from. An interactive online map by the Wall Street Journal allows you to explore the trends over time, and to see the migration data for any particular U.S. city. [See: http://online.wsj.com/article/SB10001424052748704211704575140132450524648.html#articleTabs%3Dinteractive]. Big losers have been New York and Los Angeles, while some modest population gains have taken place in Texas, Atlanta, and elsewhere. The continuing housing foreclosure situation, mapped nicely by RealtyTrac at http://www.foreclosurepulse.com/blogs/mainblog/archive/2009/08/12/os-the-foreclosuredam-starting-to-break.aspx, shows the number of foreclosures in July 2009 as a proportion of all housing units, ranging from a low of about 1 in 63,290 to a high of 1 in 38. The worst spots? California, Nevada, Arizona, and the Pacific Northwest coast. Other bad patches are Southern Michigan, Florida, and cities from Colorado to Maine.
Ask Dr. Map!, p. 29
"Technomanifestos"

The "web" is called the "web" due to its non-linear nature, with hypertext and links taking our thoughts sideways or horizontally (like a web), rather than in a straight, linear fashion (Brate, 2002). Brate (2002) states that the web works the same way as the human mind, as our thoughts constantly jump and switch subjects. We do not think in the linear way that a book or manual might read; our thoughts are more random and abstract. Recently I have been concerned that I cannot seem to maintain linear conversations. Brate's theory has comforted me in that I now know that my seemingly out-of-order thought processing is, in fact, quite normal. The web, like our minds, contains a huge amount of information: the good, the bad, and the ugly. Every creation is either a reflection of reality or perhaps the embodiment of a fantasy, an imagination of the human mind. Kevin Kelly wrote an article for Wired entitled "We Are the Web." Kelly states that "we are the web" because we create, update, contribute, alter, debate, and discuss what is there. Without us, there would be no web. I think the web continues to grow and develop as we (humanity) keep growing and adding to its content. The web, like the world, does not discriminate against who or what is there; it just is, as we just are. Politics comes into play on the web as it does in real life, because its contents are simply an extension of ourselves (with an IP address).—Excerpts from a blog on http://anothermediastudent.wordpress.com.

Bibliography
Adam Brate. 2002. "Everything is deeply intertwingled." In: Technomanifestos: Visions from the Information Revolutionaries. New York and London: Texere.
Kevin Kelly. 2005. "We Are the Web." Wired, August 2005.
The future of government services—Gov 2.0

In the latest SpatialRoundtable.com discussion, on May 12th, ESRI industry solutions manager Christopher Thomas addressed the emerging trend of governments using Web 2.0 technology to improve service—Gov 2.0. Executives from all levels of government, as well as media and geographic information system (GIS) thought leaders, were engaged in a dynamic online conversation to respond to the question: "Can the GIS community provide a platform for engagement that empowers citizens?"
The British-born software engineer brought order to cyberspace by creating the World Wide Web (http://www.time.com). He is also behind the semantic web concept.
is supposed to be able to provide the data but that information may be out of date. You get a disconnected number or they don’t work for the data provider anymore. If these clearinghouses used the Semantic Web, even if you didn’t get access to the raw data, the metadata would likely point to a dataset maintained by the data provider. When the contact person for a set of data changed, the record would be updated and you would be more successful tracking down needed information. The Semantic Web arises out of a need to break data out of silos of disconnected databases and the contextual shells of web pages. But this only scratches the surface of the capability and complexity of the Semantic Web. In part two of this article, I will explain how the Semantic Web enables more intelligent searching. How, if Sir Tim Berners-Lee’s vision holds out, instead of feeding bread crumbs into a search engine and sorting through piles of meaningless links, intelligent agents will piece together the raw data from the right places to provide solutions.
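If two sites shared one contact record through a URI, an update made in one place would be visible to both—the cure for the Facebook/LinkedIn duplication described earlier. A minimal sketch of that indirection (the registry, the URIs, and the profile objects here are hypothetical illustrations, not any real site's API):

```python
# A shared "web" of records, keyed by URI; each site stores only the URI.
records = {"http://contacts.example/jane": {"phone": "303-555-0100"}}

facebook_profile = {"contact": "http://contacts.example/jane"}
linkedin_profile = {"contact": "http://contacts.example/jane"}

def contact_for(profile):
    """Dereference a profile's contact URI to the one live record."""
    return records[profile["contact"]]

# Jane gets a new cell phone; a single update serves both sites:
records["http://contacts.example/jane"]["phone"] = "303-555-0199"
print(contact_for(facebook_profile)["phone"])  # 303-555-0199
print(contact_for(linkedin_profile)["phone"])  # 303-555-0199
```

The same indirection is what would keep a clearinghouse's metadata pointing at a dataset's current contact person: the record is maintained in one place and merely referenced everywhere else.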
“Government needs to meet high expectation levels,” says Thomas. “Citizens want to interact with their government online, and GIS is playing an important role in delivering these valuable Web services.” There are two kinds of Gov 2.0 enthusiasts. One group is engaged in studying emerging technology, including cloud computing and crowd sourcing, while the other primarily sees Gov 2.0 as a movement to improve government. But a separate group—the largest group—comprises citizens who are generally unaware of Gov 2.0. In this environment, GIS is the key platform for delivering transparency and accountability.
Ask Dr. Map!, from p. 27
Dr. Map wishes there were better news to share, but meanwhile, at least we can map the misery! A few minor bright spots are the top ten places where house values are rising: McAllen, TX; Rochester, NY; Birmingham, AL; Syracuse, NY; Buffalo, NY; New Orleans, LA; Scranton, PA; Grand Rapids, MI; Baton Rouge, LA; and El Paso, TX. In most of those places, your housing dollar will still go quite a long way.
What is the Bizarre Map Challenge?
The Bizarre Map Challenge is "a map design competition open to high school, college, and university students in the United States. The goals of this challenge are to promote spatial thinking, increase awareness of geospatial technology, and inspire curiosity about geographic patterns and map representation in students and the broader public" [see http://bizarremap.sdsu.edu]. The challenge winners were announced in April of this year. First prize went to Christopher Brown of the University of Alabama for a map entitled "Alligator Bayou," which shows a pattern of French-origin "long lots" that follow the Mississippi River and a tributary and which, shown together, resemble the jaws of an alligator.
Other top ten entries included a map showing the spatial distribution of missed love connections posted in the “I-Spy” section of a free newspaper from Burlington, Vermont; the U.S. map shown as a celestial star map; and many others. Dr. Map enjoyed the “Earth in Reverse,” a world map showing terrain colors upside down and backwards. The Bizarre Map Challenge competition was supported by the National GeoTech Center and San Diego State University. Credit for the idea and the execution go to Ming-Hsiang Tsou, a faculty member at SDSU. Dr. Map approves of bizarre maps, and laments that there are too few of them to lighten up one’s cartographic day.
Dr. Map has a PhD and a cartographic license. Send questions to Dr. Map at email@example.com or visit him on the web at http://www.drmap.info
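For readers curious about the arithmetic behind solar calculators like NOAA's: day length can be approximated with the standard sunrise equation. The sketch below is a simplification—it uses Cooper's approximation for solar declination and a sun altitude of -0.833° at rise and set to account for refraction and the solar disk—and is our illustration, not the NOAA calculator's actual algorithm:

```python
from math import sin, cos, acos, radians, degrees

def day_length_hours(lat_deg, day_of_year):
    """Approximate day length from latitude and day of year."""
    # Cooper's approximation of solar declination, in degrees
    decl = 23.44 * sin(radians(360.0 * (284 + day_of_year) / 365.0))
    lat, dec = radians(lat_deg), radians(decl)
    # Sunrise equation with a -0.833 deg sun altitude at rise/set
    cos_h = (sin(radians(-0.833)) - sin(lat) * sin(dec)) / (cos(lat) * cos(dec))
    hour_angle = degrees(acos(max(-1.0, min(1.0, cos_h))))
    return 2.0 * hour_angle / 15.0  # Earth rotates 15 degrees per hour

# Washington, D.C. (38.88 N) on the June solstice (day-of-year 172):
print(round(day_length_hours(38.88, 172), 1))  # ~14.9 hours
```

This agrees with Dr. Map's numbers to within a few minutes; the full NOAA model also handles the equation of time, time zone, and daylight saving, which this sketch ignores.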
Spy vs. spy on Facebook —by Monica Hesse
On December 5, 2009, the Defense Advanced Research Projects Agency set out to learn how quickly people could use online social networks to solve a problem of national scope. The answer? Eight hours and 56 minutes, at least when said problem involved $40,000 and a bunch of red balloons. In DARPA's Network Challenge, tied to the 40th anniversary of the Internet, the Department of Defense's research arm placed 10 weather balloons in public places around the country. The first team to locate the balloons and submit their correct geographic coordinates would get the cash prize. Ready, set, Twitter!
More than 4,000 teams participated, and more than a few interesting things were revealed about the human psyche. "It's a huge game-theory simulation," said Norman Whitaker of DARPA's Transformational Convergence Technology Office. The only way to win the hunt was to find the location of every balloon, but a savvy participant might withhold his sighting until he'd amassed the other nine locations, or disseminate false information to throw others off the trail. Sure enough, Twitter and Facebook were abuzz with offers to sell coordinates for alleged sightings. There was much excitement over the red balloon in Providence, R.I. There was no red balloon in Providence—just a Photoshopped decoy circulated by a conniving player. The winning team was spearheaded by Riley Crane, a postdoctoral research fellow at MIT's Media Lab. MIT's team set up an elaborate information-gathering pyramid. Each balloon was allotted $4,000. The first person to spot one would be awarded $2,000, while the people who referred them to the team would get smaller amounts based on where they fell on the info chain. Any leftover money, after payment to spotters and their friends, was to be donated to charity. Crane was less interested in the monetary prize than in the potential for social research. "On the science side, we're scratching the surface of this
tremendous new system" of social networks. "With this data set we have the potential to understand how to face—and exploit—the challenges that come with living in this interconnected world." The practical possibilities of the Network Challenge go far beyond a research lab. The power of social networks is already well documented: Earlier in 2009, information about violence in Iran continued to be dispersed through Twitter even after traditional news sources were squelched. Crane wondered what types of applications might result from the data about information dispersal collected during the DARPA Network Challenge weekend: "Could we design an alert system to help us find missing children? Could we redesign the incentive structure for police rewards?" DARPA officials met with participants throughout the week to debrief them on their strategies. Not everyone believed their motives were pure. After all, what would an intersection between the government and the Internet be without a few conspiracy theories?
The DARPA Network Challenge crown went to MIT, whose team was first to submit the locations of the 10 weather balloons placed at fixed locations around the country. (DARPA)
“Looks to me that ‘someone’ has lost a balloon with something very important in it, and now is making all this fiction to promote its prompt finding,” wrote a commenter on NewScientist.com. Care to comment, Dr. Whitaker? “That,” he said, while trying to keep a straight face, “is an amazing story.”
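MIT's referral pyramid—$2,000 to the spotter, smaller amounts up the chain, the remainder to charity—can be sketched as a simple geometric split. The halving at each step is our reading of the scheme described above, and the sketch is an illustration, not MIT's actual accounting:

```python
def payouts(chain_length, balloon_purse=4000.0):
    """Split one balloon's purse up a referral chain, halving at each step.

    amounts[0] goes to the spotter, amounts[1] to whoever recruited the
    spotter, and so on; whatever is left over goes to charity.
    """
    amounts, share = [], balloon_purse / 2.0
    for _ in range(chain_length):
        amounts.append(share)
        share /= 2.0
    charity = balloon_purse - sum(amounts)
    return amounts, charity

# Spotter, their recruiter, and the recruiter's recruiter:
print(payouts(3))  # ([2000.0, 1000.0, 500.0], 500.0)
```

The elegance of the scheme is that recruiting helpers can never cost a participant money: every level of the chain is paid out of money that would otherwise sit unclaimed.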
Share GIS perspectives with others
Does "measure," "model," and "map" apply to oil spills? —by Jeff Thurston

Geographic information systems provide valuable data and information for effective decision-making in a multitude of real-life situations. Attempting to respond to an oil spill, for example, without making good use of GIS tools is not just unwise; it can prolong the response to the disaster and result in poor information flows between decision-makers and responders, among responders, and between responders and the public.

From the moment crude oil spills out, it becomes a spatial problem which immediately calls for understanding its dimension, communicating its size and impact, and coordinating the response to those impacts. Maps are like language; through their graphics, symbology, and text, stories are created and shared. Visually powerful maps cut to the chase and provide understanding immediately.

Oil spills are 2D, 3D, and 4D problems. Consider the oil spill in the Gulf of Mexico. Originally most communication focused on location—people wanted to know where the spill was happening. A latitude and a longitude provided the (x, y) of the location in 2D. But as the oil spill continued, its duration became the over-riding issue. Still later, people's attention turned from the spill's location and temporal dimension to the horror of its greasy presence on the surface and in dark columns beneath the surface—the third dimension of the problem.

A number of technologies exist for measuring the location of 2D and 3D objects. However, if we want to know what the effects of the oil spill might be on the Gulf's beaches, for instance, then one needs to look at the problem in a more comprehensive manner. Sophisticated tools have been developed by GIScience to analyse measurement data contributed by crowd-sourcing, build models of the disaster, its effects, and the likely path to mitigating them, and then share this information via a digital map.

I find it odd that these tools are yet to be fully implemented in the Gulf. We need accurate baselines from which reparations can begin now. We need accurate analysis and visualization of the problem we are confronting to respond to it effectively and, above all, we need to communicate more—about the oil spill's impact on people's livelihoods, their health, and the health of the ecosystem that supports life in the Gulf.

One of the technologies already used to capture the "third dimension" of natural and man-made crisis events is LiDAR (light detection and ranging) mapping. The State of Louisiana is pursuing a statewide LiDAR mapping program (http://atlas.lsu.edu/central/la_lidar_project.pdf). Doubtless, the geospatial data this program has been collecting will prove immensely valuable in confronting the work in the Gulf over many years to come.

Literature abounds in research into GIS and other geospatial technologies which increase our understanding of the world around us and our dealings with it. One of the latest among many excellent contributions is Ocean Globe, a book published by Esri, which examines the use of bathymetry, GIS, and other technologies to map the ocean floor. "Our perception of the ocean floor has expanded through the use of GIS tools and geospatial applications," writes Joe Breman, editor of the anthology. "The more we know about the underwater environment, so seldom visited by most people, the more our lives will benefit above ground." —How very true in the context of the current crisis.
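The 2D/3D/4D framing amounts to attaching (x, y, z, t) to every observation, so that a map can be sliced by depth or by time. A minimal sketch of that idea in Python—the observation values below are invented for illustration, not real spill data:

```python
from typing import NamedTuple

class Observation(NamedTuple):
    """One oil-spill observation: position (2D), depth (3D), and day (4D)."""
    lat: float
    lon: float
    depth_m: float
    day: int  # days since the spill began

observations = [
    Observation(28.74, -88.37, 0.0, 1),      # surface sheen near the wellhead
    Observation(28.90, -88.50, 1100.0, 20),  # subsurface plume
    Observation(29.10, -89.00, 0.0, 45),     # oil nearing the shoreline
]

def seen_by_day(obs, day):
    """The spill's known extent as of a given day—a temporal slice of the data."""
    return [o for o in obs if o.day <= day]

print(len(seen_by_day(observations, 30)))  # 2
```

Filtering on `depth_m` instead of `day` would give the 3D slice; a real response system layers many such slices into the kind of interactive map the article calls for.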
The Fed Page

Tracking the Gulf oil spill
—by Alice Lipowicz

On June 15th, the National Oceanic and Atmospheric Administration launched an exciting new web tool—the GeoPlatform—which offers the general public the same information about the Gulf oil spill that responders are receiving. The site employs the Environmental Response Management Application (ERMA®), a web-based GIS platform developed through a joint partnership between NOAA and the University of New Hampshire's Coastal Response Research Center. The interactive mapping features of the tool make it possible for the site to deliver near-real-time (updated approximately every 10 minutes) AIS data from the vessels supporting the largest oil spill response and recovery operation in U.S. history. Originally designed for responders who make operational decisions on the oil spill disaster, the platform integrates the latest data on the oil spill's trajectory, closed fishery areas, wildlife, and Gulf resources—such as oiled shoreline and the daily position of research ships—into one customizable, interactive map. Apart from NOAA, the U.S. Coast Guard, the Environmental Protection Agency, the U.S. Fish and Wildlife Service, the U.S. Geological Survey, the Homeland Security Department, NASA, and several states are contributing data to the GeoPlatform [http://www.GeoPlatform.gov/gulfresponse]. "This web site provides users with an expansive, yet detailed geographic picture of what's going on with the spill," said Jane Lubchenco, NOAA administrator. "It's a common operational picture that allows the American people to see how their government is responding to the crisis." A separate public web site—Deepwater Horizon Response—has been offering news, announcements, and information about the disaster. It is operated by the Deepwater Horizon Unified Command, which consists of DHS, the Defense and Interior Departments, other federal agencies, BP, and other private entities.
In addition, the Unified Command recently set up a Deepwater Horizon Response Facebook page that links to its other web site. [Alice Lipowicz is a writer for Defense Systems.]
Modernizing the NSRS
—by Keeley Belva

NOAA's National Geodetic Survey (http://www.ngs.noaa.gov)—the official U.S. government source for determining precise latitude, longitude, and elevation—is implementing a modernization effort which takes into account advances in GPS and other technologies. The effort is important to all activities requiring accurate positioning information, including levee construction projects, the design of evacuation routes in hurricane-prone areas, and the forecasting of sea-level rise in coastal communities. The modernized National Spatial Reference System will take even greater advantage of newer technologies and better track changes in position and elevation over time. The proposed changes will improve and update digital maps and will have a bearing on the work of civilian federal mapping authorities, as well as state and municipal governments that have adopted the National Spatial Reference System.

"The reference frame in the past was hampered by being held static in time on an Earth that is constantly changing," said Juliana Blackwell, director of NOAA's National Geodetic Survey. "The new methodologies better capture changes such as subsidence or sea-level rise, and the improved points of reference benefit everyone using positioning data for the foundation of their work."

A modernized reference system will allow users to easily calculate accurate positions using a survey-grade GPS receiver in conjunction with a scientific model of Earth's gravity field. In 2009, a NOAA-commissioned, independent socio-economic study estimated the value of these modernization efforts at $4.8 billion over the next 15 years, including $2.2 billion in avoided costs from improved floodplain management.

"An improved vertical datum means elevation measurements will become more accurate and less expensive, helping the National Flood Insurance Program to reduce the impacts and losses caused by flooding," said Paul Rooney, a mapping technology specialist at the Federal Emergency Management Agency (FEMA).

A Federal Geospatial Summit held at NOAA headquarters in Silver Spring, Md. [and reported on p. 23 of this issue] marked the beginning of a dialogue with users to help plan far in advance for the necessary changes to infrastructure and operating methodologies.
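The GPS-plus-gravity-model workflow described above reduces, at its core, to one standard geodetic relation: an orthometric ("height above sea level") value H is the GNSS-measured ellipsoid height h minus the geoid undulation N taken from the gravity model. A sketch of that relation—the sample heights below are invented values, not survey data:

```python
def orthometric_height(ellipsoid_height_m: float, geoid_undulation_m: float) -> float:
    """Orthometric height H from a GNSS ellipsoid height h and geoid undulation N.

    Uses the standard relation H = h - N. The sign convention matters: in much
    of the conterminous U.S. the geoid lies below the ellipsoid, so N is negative.
    """
    return ellipsoid_height_m - geoid_undulation_m

# Hypothetical example: GPS reports h = 50.0 m; the geoid model gives N = -32.5 m.
print(orthometric_height(50.0, -32.5))  # 82.5 m above the vertical datum
```

The accuracy of H is limited by the geoid model, which is why an improved gravity-field model translates directly into cheaper, more accurate elevations.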
The Tech Page
—by Matt Ball
The concept of Light Detection and Ranging (LiDAR) is really quite simple—it involves tuning the wavelength, pulse width, and frequency of laser light, bouncing that light off objects, and capturing the returning light over time to measure X, Y, and Z dimensions as well as the returning light's intensity. The technology has proved quite useful for capturing 3D terrain and features, and it is being used extensively to map infrastructure and natural resources. Since the technology's inception in the late 1960s, LiDAR has been applied to atmospheric studies of air quality, marine and hydrographic studies, bathymetric and water quality studies, surveying and mapping, and positioning and guidance. The technology has been tweaked and fine-tuned for each subsequent application area, funnelling improvements back into technology development. Yet we're still just scratching the surface of this technology's capabilities.

intensity returns
Because of its ability to measure and classify different intensity values of the light returns, LiDAR can be fine-tuned to capture and record a variety of different phenomena, both visible and invisible. The intensity can be customised for atmospheric research aimed at determining which elements are present in the air, using the distinct signatures of molecules. Something similar can be done with water. The signatures of the elements can then be monitored to understand changes in the atmosphere and in the composition of our water bodies. Seeing what can't be seen by the naked eye is a key application of this technology. LiDAR is used to sense, from a distance of as much as one kilometer, such phenomena as variations in soil compaction, which can signal soil pollution, or the health of forests and their susceptibility to fires. The potential of LiDAR in monitoring change in the physical world is truly limitless. But to take advantage of this potential we will need sophisticated LiDAR sensors capable of measuring not only distance but also motion and composition, and of classifying that information.

multi-sensor configurations
High-resolution color images produced by aerial and terrestrial applications of LiDAR capture reality, which can then be represented by models as virtual reality. Engineering and design projects, as well as the entertainment and game industries, have successfully incorporated LiDAR into their workflows. Adding other sensors to the aerial platform, such as hyperspectral or thermal imagers, provides further sensing capabilities. Hyperspectral imaging adds to the topographic information of a 3D scan high-resolution measurements extracted from different color bands of an image. Thermal remote sensors do what their name says: they sense heat. One of their applications is in combating fires, providing information about fire behavior under different weather conditions and about the mitigation approaches chosen.

Multi-sensor configurations are proving to be very desirable for scientific and surveillance applications. Emerging technologies such as LiDAR boost interest in understanding change on our planet, even among the general public.

lidar apps
Just as video cameras once were, LiDAR sensors are still too expensive for most buyers, but they will eventually come down in price and become ubiquitous in areas in need of constant measurement. One such area is surveillance, where a LiDAR system could be used to create a virtual fence and alert a central system when encroaching objects pass a certain distance threshold, or when their profile or composition suggests metals or explosives. With LiDAR sensors deployed on satellites, we have constant measurement from space, on a global scale. With increasing speed of data classification and analysis, we'll gain a much greater understanding of change. And, with increasing overlap of sensors at various scales, we'll be able to aggregate these different measurements for a much greater understanding of the whole, be it at a regional or a country scale, all the way down to millimetres of accuracy on the ground.

overcoming data limitations
The biggest technological hurdles to greater LiDAR utility have been the large digital storage space required to house the measurements and the computing power needed to visualize and analyse the data. Rapidly dropping prices of computer storage and capacity are easing some of these burdens, and cloud computing is providing new ways of dealing with the data and visualization limitations. By harnessing large numbers of computers, analysts can dive into the details of the data faster and much more easily. The ability to store large
LiDAR potential, p. 34, 2nd col.
How can it be that we’re only scratching the surface of LiDAR’s potential?
Oil spill reveals the dangers of success —by Robert J. Samuelson

An intriguing aspect of the BP oil spill is that, before the accident, deepwater drilling seemed to be a technological triumph. About 80 percent of the Gulf of Mexico's recent oil production has come from deepwater operations, defined as drilling at water depths exceeding 1,000 feet. In 1996, that figure was 20 percent. Jack-up rigs, which are oil platforms on stilts in a few hundred feet of water, have given way to the "mobile offshore drilling unit" (MODU), which keeps its position through the interaction of global positioning satellites and on-board engines that activate directional propellers to offset ocean currents and wind. Seismology and submersible robotic technology have also advanced. The Deepwater Horizon rig was not testing new limits. It was drilling in about 5,000 feet of water when others have approached 10,000 feet. The safety record was good. The American Petroleum Institute, the industry's main trade group, says that since 1947, oil companies have drilled more than 42,000 wells in the Gulf of Mexico and recovered about 16.5 billion barrels of oil. Against that, spills totaled about 176,000 barrels from 1969 to 2007. In a typical year, it was a few hundred barrels. By contrast, recent production is about 1.6 million barrels a day, and the latest estimates put the current spill in the Gulf in the millions of gallons. There will be extensive analysis of the causes and ultimate impacts of the Gulf oil spill, but the stark contrast between the disaster's magnitude and the previous safety record points to a perverse possibility: The success of deepwater drilling led to failure. It sowed overconfidence. Continuing achievements obscured the dangers. This pattern applies to other national setbacks. Consider the financial crisis. It was not the inherent complexity of subprime mortgages or collateralized debt obligations (CDOs) that caused the crisis.
It was the willingness of presumably sophisticated investors to hold these securities while ignoring the complexity and underlying risks. This behavior was understandable at the time. The economy seemed to have become less risky. High inflation had been suppressed, and economists talked of the "Great Moderation." The belief that past economic and financial instability had been quelled, however, created future instability by encouraging risky behavior. Or take Toyota's woes with faulty accelerators. Few auto companies enjoyed as enviable a reputation as Toyota did until recently. The automaker consistently did well in surveys of reliability and customer satisfaction. This success—and the image of Toyota cars as the safest on the market—helps explain why Toyota reacted so slowly to reports of sudden acceleration. Problems were minimized because they seemed out of character for Toyota.
LiDAR potential, from p. 33
amounts of data on shared machines also eases some of the burden of storage management and makes the data much more readily accessible. With easier data storage, analysis, and integration capabilities in the cloud, more sensors can be employed for data collection and new applications. The future of LiDAR is bright precisely because of the uniqueness of its sensing capability. In a time when we need more and better means to measure and analyse our world, LiDAR technology is the way to proceed in the coming decades.

literature

• Proposed NASA LiDAR Surface Topography Mission: Background.
• "Quantifying structural physical habitat attributes using LiDAR and hyperspectral imagery." Environmental Monitoring and Assessment 159, Dec. 2009.
• Christos Koulas. "Extracting wildfire characteristics using hyperspectral, LiDAR, and thermal remote sensing systems." Proceedings of the SPIE, Vol. 7298, 2009.
One theory of the oil spill in the Gulf is that the deepwater technology is inherently so complex and dangerous that it can't really be understood or regulated. The safety record before the BP spill seems to rebut that. The problem is that the system broke down. Careless mistakes were made. Or regulators were co-opted by industry. Judgments were botched. Something. The post-crisis investigations will presumably fill out the story, but no one has yet suggested that the blowout reflects a previously unknown geological phenomenon—something in the oil formation—or a quirk of technology that no one could have anticipated. Perhaps studies will reveal one or the other. But the prevailing assumption is that this accident was preventable, meaning that human error was responsible. There's a pattern to our calamities or, at any rate, some of them. Success tends to breed carelessness and complacency. People take more risks because they don't think they're taking risks. The regulated and the regulators often react similarly because they've shared similar experiences. The financial crisis didn't occur so much because regulation was absent (many major financial institutions were regulated) but because regulators didn't grasp the dangers. They, too, were conditioned by a belief in the "Great Moderation" and lower financial volatility. It is human nature to celebrate success by relaxing. The challenge we face is how to acknowledge this urge without being duped by it.
National Center for Airborne Laser Mapping comes to Houston

The National Center for Airborne Laser Mapping and the ground-breaking researcher leading it recently moved operations to the University of Houston. By increasing its cadre of laser mapping researchers, the University of Houston will be able to expand its pioneering work in such areas as homeland security, disaster recovery, oil and gas exploration, wind farm site planning, and environmental studies. —by Thomas Shea

Previously based at the University of Florida, the National Center for Airborne Laser Mapping (NCALM) moved to Houston this January, together with its director, Ramesh Shrestha, Hugh Roy and Lillie Cranz Cullen Distinguished Professor of Civil and Environmental Engineering. The center is sponsored by the National Science Foundation, and hosting it is a huge boost for the University of Houston and the research it conducts on LiDAR (light detection and ranging). Since its establishment in 2003, the center's focus has been on ground-based scanning laser technology and airborne laser swath mapping research, and this focus is likely to continue. Shrestha brought much of his Florida team with him to Houston, where NCALM is operated jointly with the University of California, Berkeley.

"With the center, we have brought laser mapping's uses to the forefront and expect to continue to have this impact in our new Houston home," Shrestha said. "We plan to establish a curriculum catering to this specialty and eventually add a graduate degree in geo-sensing systems engineering. This is in addition to carrying out research far surpassing what has been possible in laser mapping to date."

Shrestha's work with laser mapping goes back to the 1990s, when this once niche research area was just making its debut. Bill Carter, now a research professor at the University of Houston and formerly Shrestha's colleague at the University of Florida, helped establish NCALM. "Together, we saw [LiDAR's] potential to far exceed that of traditional large-scale mapping methods which use cameras to view terrain," Carter said. "Laser mapping has the ability to work day or night and even map areas covered by forests and other vegetation—which is impossible to do with photogrammetric methods."

LiDAR in Houston, p. 36, 2nd col.
LiDAR in Houston, from p. 35
ALTA/ACSM Land Title Survey corner by Gary Kent
I have heard rumors of a new version of the Standards for ALTA/ACSM Land Title Surveys. Is there a new version and when will the new standards be effective?
The NSPS Committee on ALTA/ACSM Standards, with input from attorneys on the American Land Title Association's committee and about 100 interested surveyors, has been working on a new version of the standards for the last year and a half. The document, titled the 2011 Minimum Standard Detail Requirements for ALTA/ACSM Land Title Surveys, is nearing a final draft. Later this summer there will be a joint meeting of members of the NSPS and ALTA committees to review the entire draft and finalize any outstanding issues. The consensus version will then be voted upon by the entire membership of the American Land Title Association at its annual meeting in San Diego in October, and by the NSPS Board of Governors and Board of Directors at their fall meetings in Orlando in November. It is anticipated that the new standards will go into effect on February 1, 2011. The document will be available online and in various publications prior to its effective date. A future column will provide more details on the new version, but in addition to a number of changes and revisions, the 2011 ALTA standards will represent a major rewrite of the current standards, with various parts completely reorganized.
It wasn't long before others at the University of Florida warmed up to LiDAR. This was inevitable, especially after the two scientists developed techniques to remove or minimize some of the errors seen in the early years and the equipment was fine-tuned to collect large quantities of data. As many as 167,000 points per second can now be mapped with LiDAR, compared to 3,000 when the technology was in its infancy. Shrestha's and Carter's research has changed the way the State of Florida monitors erosion on its coastline. They produced the highest-resolution 3-D images in existence of the San Andreas Fault, and their LiDAR expertise has taken them across the globe to map Mayan ruins in Belize and volcanoes in Hawaii. Evaluations made using LiDAR before and after a hurricane landfall or an earthquake can be used to improve such things as building design, as well as to develop powerful predictive methods to better anticipate the response needed in future catastrophic events. Future NCALM research will explore the possibility of using LiDAR to map a variety of phenomena, be they glacial movements near the North Pole or the migration of penguin colonies in Antarctica.

With LiDAR, the ground's surface is "sensed" remotely using laser pulses fired from a small plane, such as the Cessna 337 Skymaster operated by NCALM. Thousands of small cone-shaped pulses travel through a hole in the bottom of the plane to the ground, and a detector picks up the rays reflected from each point on the ground. Each point's distance is then determined by measuring the time delay between the transmission of a pulse and the detection of the reflected signal. The plane's location and movement in the air are tracked by an inertial measurement unit fixed inside the laser system, together with a GPS receiver mounted on the plane and others stationed on the ground.
Both the plane's location and the speed of its movement are used, along with the laser data, to produce detailed 3-D topographic images of the terrain. "In coming years, our group plans to develop a next-generation LiDAR system. The unit would be less expensive than commercially available systems and allow for some of the most accurate, highest-resolution observations possible in laser mapping," Shrestha said. "We want to develop a system like no one else has developed. This would really change what could be done with this technology. The new system would have new features, be faster, smaller, and capture more data during each flight than we can today." According to Shrestha, this system would use a much shorter pulse-length laser, increasing the number of points that could be mapped per second to 800,000. The impact on data accuracy and the amount of time needed in the air to collect the data would be enormous.
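The time-of-flight principle described above reduces to a one-line formula: the pulse travels to the ground and back, so the one-way range is half the round-trip time multiplied by the speed of light. The following is an illustrative sketch, not NCALM's software; the function name and the example timing are hypothetical.

```python
# Illustrative LiDAR time-of-flight ranging (a sketch, not NCALM code).
# A pulse makes a round trip, so the one-way range is c * t / 2.

C = 299_792_458.0  # speed of light in m/s (vacuum value; air is marginally slower)

def range_from_round_trip(t_seconds):
    """One-way range, in meters, for an echo received t_seconds after firing."""
    return C * t_seconds / 2.0

# An echo received about 10.2 microseconds after transmission corresponds
# to a target roughly 1.5 km below the aircraft:
print(round(range_from_round_trip(10.2e-6), 1))  # → 1528.9
```

At 167,000 pulses per second, each such range, combined with the GPS/IMU record of where the aircraft was and how it was oriented at the instant of firing, becomes one point in the 3-D point cloud.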
Book Review

Designing Geodatabases for Transportation, by J. Allison Butler. © 2008 ESRI Press, Redlands, California. ISBN 978-1-58948-164-0. 461p. —Reviewed by Landon Blake
Designing Geodatabases for Transportation is a guide to creating geodatabases for transportation networks. It describes many aspects of real-world transportation networks and explains how these can be modeled in a database with spatial capabilities. Designing Geodatabases is an excellent resource for GIS professionals who need to build and maintain their own spatial data sets representing transportation networks. It is also a valuable reference for land surveyors, cartographers, land-use planners, civil engineers, and other professionals who need to understand how transportation systems work or who deal with spatial data sets representing transportation features. In addition, programmers, database managers, or GIS professionals who need to build geodatabases of any kind can benefit from the book's coverage of such topics as database normalization, versioning, alternative methods of modeling real-world features, and behavior.

The book has been written primarily with users of ESRI software in mind as the target audience. I am not an ESRI software user, yet I found the book of great benefit as well. Many of the principles contained in the book could be applied to software from other GIS or database vendors, or to work done with open source programs.

Designing Geodatabases is well written. Technical terms are well defined and unfamiliar aspects of transportation networks are explained. The book is also well organized. Subject matter is logically separated into 19 chapters, and the chapters follow a logical order. A description of each chapter's contents is typically included in the first couple of pages of each chapter. The book has broad and deep coverage of topics. It is filled with attractive color diagrams and figures which clearly convey information in graphical form. The book is applicable to geodatabase design in general, and to transportation systems in particular.

content

Chapter 1: Introduction. In this chapter, a basic definition of transportation is provided. Two common transportation problems which can be solved using data stored in a transportation geodatabase are identified. The author describes transportation systems and admits that their diversity requires something more than a one-size-fits-all geodatabase solution. A GIS data model is described, and a short explanation of how we represent the features of a transportation system in a geodatabase is provided. This introductory chapter also introduces agile development and concludes with a note on how the book is organized.
Chapter 2: Data Modeling. This chapter begins with a discussion of how to design geodatabase objects which represent real-world objects. It follows with basic instructions on geodatabase design. A six-step process is described for data modeling in a GIS, with the first three steps examined in detail.

Chapter 3: Geodatabases. This chapter starts with a list of benefits that a geodatabase offers when compared to a purely relational database. It then defines a geodatabase and looks at the components and structure of a geodatabase, including attribute domains, valid value tables, subtypes, and relationship classes. It concludes with an excellent discussion of normalization in a database, with examples from the modeling of transportation systems.

Chapter 4: Best Practices in Transportation Design. The focus of this chapter is representing transportation system features at various scales. This includes dealing with the difference between logical centerlines and carriageways, variable-width roads, and multipoint intersections. Various methods for breaking road features into individual segments are discussed, as are the use of TIGER data for transportation layers, the handling of multiple street names in a geodatabase, and emergency dispatch and pavement management applications. The information in this chapter will benefit users who need to maintain geospatial datasets representing elements of a transportation system (such as a road network).
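The normalization idea praised in Chapter 3 can be illustrated outside any particular GIS. The sketch below is mine, not the book's, and all table and field names are hypothetical: a denormalized segment table repeats street-level facts on every row, while the normalized form stores each fact once and references it by key.

```python
# Hypothetical illustration of database normalization for road data
# (not an example from the book; all names are invented).

# Denormalized: the street name and speed limit repeat on every segment row,
# so changing the limit means touching many rows and risking inconsistency.
denormalized = [
    {"segment_id": 1, "street": "Main St", "speed_limit": 35, "length_m": 120.0},
    {"segment_id": 2, "street": "Main St", "speed_limit": 35, "length_m": 80.0},
]

# Normalized: street-level facts live in one row of a streets table,
# and segments reference it by street_id.
streets = {10: {"name": "Main St", "speed_limit": 35}}
segments = [
    {"segment_id": 1, "street_id": 10, "length_m": 120.0},
    {"segment_id": 2, "street_id": 10, "length_m": 80.0},
]

# A speed-limit change now touches exactly one record:
streets[10]["speed_limit"] = 30
assert all(streets[s["street_id"]]["speed_limit"] == 30 for s in segments)
```

The same trade-off appears in a geodatabase as relationship classes and valid value tables: duplication is traded for joins.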
Chapter 5: Geometric Networks. This chapter describes a spatial network as a series of edges and junctions and explains the difference between simple and complex edges. It offers options for handling bridges and tunnels in a network. It follows with eight pages of discussion of traffic demand modeling and a six-page discussion of path finding. Other topics include methods to avoid storing duplicate information for route intersections and the use of traversals for path finding.
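The edge-and-junction model that Chapter 5 builds on lends itself to ordinary shortest-path search. The sketch below is illustrative only, not code from the book; the tiny network and all names are hypothetical, and Dijkstra's algorithm stands in for whatever path-finding method a given GIS uses.

```python
# Minimal path finding over an edge/junction network (an illustrative
# sketch, not the book's implementation). Junctions are nodes; each edge
# carries a traversal cost; Dijkstra's algorithm finds the cheapest path.
import heapq

def shortest_path_cost(edges, start, goal):
    """edges maps a junction to a list of (neighbor, cost) pairs."""
    pq = [(0.0, start)]
    best = {start: 0.0}
    while pq:
        cost, node = heapq.heappop(pq)
        if node == goal:
            return cost
        if cost > best.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, w in edges.get(node, []):
            nc = cost + w
            if nc < best.get(nbr, float("inf")):
                best[nbr] = nc
                heapq.heappush(pq, (nc, nbr))
    return None  # goal unreachable

net = {"A": [("B", 2.0), ("C", 5.0)], "B": [("C", 1.0)], "C": []}
print(shortest_path_cost(net, "A", "C"))  # → 3.0, via B
```

A traversal, in the chapter's terms, would be the recovered sequence of edges rather than just the cost; storing predecessors alongside `best` yields it.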
Chapter 6: Data Editing. This chapter covers geodatabase versioning, check-in/check-out procedures, and continuous editing (permanently saving database edit history). Other aspects discussed are supporting the temporal characteristics of transportation features and preserving data dictionary history. Finally, reasons for separating the editing and publishing databases are given.

Chapter 7: Linear Referencing Methods. There are two problems associated with creating and maintaining large transportation geodatabases. The first is defining the extent of linear transportation features. The second is that transportation facilities and their attributes change over time. Linear referencing systems and linear events are discussed, as well as dynamic segmentation. The chapter also examines creating routes from individual road segments, creating traversals (routes composed of other routes), and determining the difference between static and dynamic traversals.

Chapter 8: Advanced Dynamic Segmentation Functions. These include using offset events to represent features that are not directly on a road alignment. Using dynamic segmentation operations, the pattern, width, and color of displayed transportation features can be changed. The chapter includes tips on how to join linear event tables, import external event tables into the geodatabase, and use "elements" to represent the different physical parts of a transportation facility (ditch, guard rail, median, etc.).

Chapter 9: Traffic Monitoring Systems. The chapter defines the term "traffic pattern" and explains some common traffic statistics. It discusses three common types of traffic counters or monitors and describes a traffic monitoring system. A discussion of how to represent traffic monitoring system events in a geodatabase follows. In addition, three other major geodatabase-related issues are discussed: handling seasonal variations in traffic monitoring data, representing traffic monitoring site maintenance, and storing traffic count data.

Chapter 10 through Chapter 17. These chapters provide an in-depth review of six different transportation data models, including the classic transportation data model, the original UNETRANS data model, a revised UNETRANS data model, a State DOT Highway Inventory data model (for editing), a State DOT Highway Inventory data model (for publishing), and a multi-purpose transit geodatabase.

Chapter 18: Navigable Waterways. In this chapter, a geodatabase design is introduced which is suitable for use by local and state governments when conducting waterway inventories focused on recreational uses. Particular attention is given to abstracting waterways into separate elements, such as the flow line (or thalweg), banks, and floodplain boundaries. The author draws attention to the river reach data included in the national hydrography data set and explains how these can be used in the geodatabase design. Finally, linear referencing on waterways, waterway events, and river mile and channel
markers are discussed, as is the use of channel cross-sections and thalweg profiles to describe channel geometry.

Chapter 19: Railroads. Chapter 19 begins with a description of the physical qualities of railroad tracks and the three main types of track segments. It follows with a discussion of track intersections and track switches. It provides a simple scheme for creating geometry to represent track segments. It documents how railroad companies and the ownership of railroad infrastructure can be stored in the geodatabase. The operation of railroad classification yards is then presented. The chapter concludes with an overview of typical track-side structures that may be represented in a geodatabase, and of the attributes and markings of railroad and road intersections (grade crossings).

conclusion

Designing Geodatabases for Transportation is an excellent guide to the challenges and nuances of representing transportation systems in a GIS. The text offers excellent advice on general geodatabase design topics. Designing Geodatabases should be on the bookshelf of anybody involved in representing and modeling transportation systems in a GIS. I hope the author will take the opportunity to expand the book's coverage of transportation on river and lake systems and on large freight railroad systems. I would also like to see added coverage of light rail, buses, and other forms of public transportation. A move to make the book more "software vendor neutral" would be an improvement. Designing Geodatabases for Transportation is a comprehensive guide to the application of GIS to the transportation domain. Many professionals would benefit if there were similar guides for the other common GIS application domains.
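The linear referencing idea that runs through Chapters 7 and 8 (locating a feature by its measure along a route, independent of how the route is segmented) can be illustrated in a few lines. This is a sketch of mine, not code from the book; the route and all names are hypothetical.

```python
# Illustrative linear referencing (not from the book): given a route stored
# as a polyline, locate the point at a given measure (distance along it).
import math

def locate_event(route_xy, measure):
    """Return the (x, y) position at `measure` units along the polyline."""
    remaining = measure
    for (x0, y0), (x1, y1) in zip(route_xy, route_xy[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if remaining <= seg:
            t = remaining / seg  # fraction of the way along this segment
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        remaining -= seg
    raise ValueError("measure exceeds route length")

# A point event at measure 3.0 on a simple L-shaped route:
route = [(0.0, 0.0), (2.0, 0.0), (2.0, 5.0)]
print(locate_event(route, 3.0))  # → (2.0, 1.0), one unit up the second leg
```

Dynamic segmentation is this operation applied to whole event tables: attribute records carrying measures (or from/to measure ranges) are rendered against the route geometry at display time, rather than being stored as split geometry.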
Two- and four-year surveying and geomatics programs

"The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn." —Toffler and Toffler, 1999

The relationship between two- and four-year surveying and geomatics programs in the U.S. is more complex and interconnected than in many other countries. This says something about the nature of our profession, but more importantly, it has a major impact on surveying education in the U.S.
—By N.W.J. Hazelton
There is a general belief that two-year programs are designed for producing technicians, while four-year programs produce professionals. This may well be the case in many engineering disciplines where the NCEES Fundamentals of Engineering examination is difficult to pass without having a four-year degree, but the relationship is more complex in surveying, and worth exploring. There are more two-year surveying programs in the U.S. than there are four-year programs. The fact that many states require no more than a two-year educational qualification for professional registration means that in many areas, the two-year Associate Degree is the pathway to becoming a professional surveyor. Naturally, educational programs in any discipline have varying degrees of excellence. If we restrict discussion to high-quality programs only, what are the characteristics of each type of program and what is their role in the future of surveying and geomatics in the U.S.?
The major constraints on any educational program are available time and non-discipline content required. In the U.S., most scientific and technical programs require about 25 percent of material to be of a general nature,
e.g., English, humanities, and diversity courses. The standard programs are designed to run for two or four years, assuming well prepared, full-time students. For many educators, the problem is what to omit from the program, as there are more than enough important topics to fill four years, even without going into great depth in various areas. As the rapid pace of technological change has provided ever more material to include in programs, the need to develop a rational basis for leaving topics out of a program has elevated the importance of connecting to a program’s constituents. This is both a central theme in ABET accreditation and an opportunity for the profession to think about where it wants to go in the future. There is an expectation in the jobs marketplace that a two-year program should produce a technically competent person, ready to work under direction with current equipment, but not necessarily ready to take significant responsibility. With four-year graduates there is an expectation that they will have a longer learning period to be fully useful in the field, but that they are better prepared to take responsibility in the long term. They are also expected to become decision-makers and teachers, to understand and be
able to design measurement systems, to understand the why and how of technology and processes, and to be the profession's future leaders. There is plenty of room for both types of graduates in the job market and the profession's future. There is a general trend that tends to eliminate jobs from the lower levels of the skills range, but there are still enough opportunities in surveying that this should not be a major factor in people's choice of a career level, and hence education choices. The big question for educators is: how do we best prepare people to fill these two somewhat different markets? Clearly the two-year graduates need a lot of hands-on experience and technical skills, while four-year graduates need more theoretical and professional skills, and can get by with rather less hands-on experience. As noted in a previous article (ACSM Bulletin, April 2010), the spatial information cycle can help with understanding the differences in each type of program and graduate. Two-year programs produce people whose strengths are in the measurement area and the basic structure and transformations area. The primary processes used here are variations of transformations, and so are largely algorithmic in nature. These can be taught using something closer to a cookbook approach ("get this, do that"). There is no need for a deeper understanding of the processes involved, and learning by experience is central to developing graduates who can do practical measurement work. Four-year programs come into focus later in the spatial information cycle. While their graduates are expected to understand the measurement and transformation processes, they also have to understand them at a deeper level, which means a sound theoretical understanding. They are expected to understand the results of these processes, and how to apply them
to very different processes. Knowing how errors work plays a significant part in this understanding. The focus is on pattern recognition, measurement system design, and moving spatial information and knowledge around the spatial information cycle. It is dealing with the planning for, and products from, measurement systems, not just doing the measurements. Teaching people pattern recognition means supplying them with a lot of patterns, and explaining those patterns, so that they can build up their own systems of recognition. Algorithmic methods don't work here; otherwise you end up with the one-solution-fits-all-problems approach. Pattern-recognition-based problem solving is a hallmark of a professional with an understanding of his/her field at a deeper level. Clearly—going back to surveying education—there are two types of programs here, each with its own focus. It is fairly simple to work from their general objectives and develop suitable programs, albeit with the limitations of available time affecting the content for each.
The next question is, "How do the two-year programs articulate with the four-year programs?" There are several programs around the country where the first two years of a four-year degree are actually a stand-alone two-year program. In some cases, the two-year programs are very much technically focused, but some graduates choose to continue with the four-year degree. In other cases, the two-year program is basically the first two-year period of a four-year program. Other programs are somewhere in between these two extremes. The problem we face in articulating between these two types of programs is that their principles and focus are radically different. Two-year programs focus on algorithmic approaches
and delineate fairly strict boundaries to the knowledge that students are required to obtain. The worldview for a two-year program graduate is necessarily limited by what is needed for the specific work expected after graduation. Four-year programs focus on pattern-based learning, with much more flexible boundaries to the curriculum. Deeper understanding means much more theoretical knowledge, whence comes the need for some calculus and physics, together with lots of linear algebra and statistics, the mathematical language of modern surveying. The four-year programs have to provide the "proto-professional" with professional skills, which are often some collection of business, presentation, leadership, and management skills; an understanding of the history of the profession; and an understanding of the ethics and standards expected of professionals. Sometimes these courses can be part of the general education requirement; usually they cannot. For graduates of a two-year program there is a lot to unlearn before they can fully take advantage of a four-year program. First, we need to realize that the limits set upon the knowledge that a two-year program imparts are not relevant to the four-year program: there is almost no limit to available knowledge in surveying and geomatics. For these reasons, the two-year graduate is unlikely to 'know it all,' no matter how well he or she did in the program. The second thing to realize is that while an algorithmic approach to problem solving may work in some cases, it is not the ideal solution. The algorithmic approach plays a role in turning an idea into an achieved solution, but it is unlikely to generate the idea. When articulating students from two-year to four-year programs, we
may need to include an additional year of "unlearning," depending upon the nature of the two-year program and the profile of the student. During this additional time, students can be introduced to the broader field of surveying and geomatics, obtain the foundational material in more advanced math and science, and generally change how they think about the profession as a whole. In the wider profession, we need to think about what we consider to be a well-prepared surveying or geomatics professional of the future; decide what level of understanding they need to have in the surveying field; decide how well prepared they need to be to deal with technological change over decades; decide at what level they need to be compared to other professionals (e.g., lawyers); and decide what education level is required to meet these needs. In the middle of all this, the connections between two- and four-year programs can also be decided.
Professionals are made, not born. While there may be an inclination
towards professional approaches, the need for education in a discipline’s patterns means that professionals are not a ‘natural’ occurrence. Professions in the modern sense did not appear until about 500 years ago. The conceptual approach of the spatial information cycle has implications for the wider profession, and the schisms within it. When our view of the profession is narrow, constrained by our earliest experiences, there doesn’t seem to be anything beyond the boundaries we perceive. We need to take ourselves beyond what we consider to be the limits of the profession to recognize new possibilities—for ourselves, our businesses, our profession and our relationship with the society in which we find ourselves. This is very difficult and unsettling for many people. It’s easier to dismiss what’s beyond the boundary as not being “real surveying.” Ultimately, we need to unlearn our earliest impressions, our way of doing things in the past. We need to relearn skills, perceptions, and ways of thinking. As the Tofflers implied, this can be as difficult as learning to read and write. With the speed of change in surveying and geomatics, we risk becoming illiterate in our own profession, unless we can unlearn and relearn much of what we thought we knew.
References Toffler, A., and Toffler, H., 1999. Foreword. In: Gibson, R. (ed.) Rethinking the Future: Rethinking Business Principles, Competition, Control and Complexity, Leadership, Markets and the World. London: Nicholas Brealey Publishing.
Berk Uslu, Dimitris Sideris, and Grant Howerton (L-R)
The Super-Information Highway Virginia Tech acquires new data-collection technology to assist Virginia DOT’s public-private highway asset maintenance partnerships—by Don Talend
Think of roads, and your mind's eye conjures an image of asphalt. But that image would miss the many other features that make roadways into intelligent integrated systems: signs, traffic signals, lighting, drainage structures, guardrails, to name just a few. Transportation departments like that of Virginia, however, keep a very close eye on these assets. They're vital for protecting the safety of the motoring public. Yet, just because there is a good reason to have quality highways does not mean that the labor-intensive task of maintaining them is easy to justify, especially now that state budgets are being squeezed by declining tax revenues. Undaunted, the Virginia Department of Transportation (VDOT) has decided to improve the management of roadway assets through a partnership with Virginia Tech in Blacksburg, Va.
The Smart Road partnership project
Virginia Tech, which is developing a powerful new roadside asset-assessment technology, was the right choice for the public–private partnership. The research is
sponsored by the Center for Highway Asset Management Programs (CHAMPS), and Topcon Positioning Systems has contributed the system's key tool, the new IP-S2 mobile mapping system (see sidebar on p. 46). Research on the monitoring system has been progressing fast. By spring 2010, Dr. Jesus M. de la Garza, the Vecellio Professor in Civil and Environmental Engineering, and graduate students Grant Howerton, Dimitris Sideris, and Berk Uslu of Virginia Tech's Vecellio Construction Engineering & Management Program, had laid the groundwork for the effort. The monitoring would be done using Topcon's IP-S2 mounted on a vehicle driving along the highways and collecting data for evaluation back at the office—a much more efficient way than having crews evaluate asset condition on foot. "We have had a close relationship with VDOT for 22 years—since I've been here," said de la Garza. The university has provided continuing education for VDOT personnel for years, and de la Garza planted the seeds for CHAMPS more than 10 years ago, after kicking around the idea with VDOT leadership over dinner during a workshop.
The Virginia Smart Road, as the project came to be known, is a full-scale, closed test-bed research facility managed by the Virginia Tech Transportation Institute (VTTI)—the school’s largest university-level research center—and is owned and maintained by VDOT. The facility consists of a two-lane, 2.2-mile-long section of pavement, an infrastructure equipped with 400 electronic sensors, and a fiber-optic backbone. Both faculty and students conduct research on the Smart Road, a portion of which is equipped with 75 weather-making towers that produce rain, snow, fog, and even ice. The facility also boasts road surveillance, a signalized intersection, a mechanism for reproducing various lighting situations, 14 experimental pavements, and 600 pavement designs. From a computer-equipped control center, researchers can observe highway traffic and driver performance. Studies have been conducted on new pavement markings, new road signs, pedestrian safety, and new vehicle head lamps. “The Smart Road is really a 2.2-mile laboratory,” de la Garza observed. “Instead of having a room with microscopes and machines that slice DNA, this lab happens to be a 2.2-mile road.”
Legislation: A boon to partnerships
The VDOT–Virginia Tech partnership benefited from the enactment, in 1995, of the Public–Private Transportation Act, which allows private entities to enter into agreements with state agencies to construct, improve, maintain, and operate state transportation facilities. Three months after the bill was signed into law, Virginia Maintenance Services
ACSM BULLETIN june 2010
Dimitris Sideris prepares the dedicated laptop computer to receive data from the IP-S2 prior to a collection run on Virginia Tech’s Smart Road
(VMS) was awarded a contract to administer and maintain about 250 miles (or about 20 percent) of Virginia’s interstate highways for five years, renewable for another five. “That was a radical, 180-degree shift from the way VDOT had maintained the interstates,” said de la Garza. “VDOT was not going to tell us what to do, how to do it, let alone when to do it. The only thing they gave us were the performance targets that they wanted us to attain. The rest was up to us—we’re expected to be innovative.”
After ten years, the Secretary of Transportation, assisted by Virginia’s Attorney General, drafted updated guidelines in accordance with amendments enacted by the 2005 General Assembly, and VDOT established the Turnkey Asset Maintenance Services (TAMS) program. Currently, the entire interstate system in Virginia is being managed under performance-based contracting. Another consequence is the addition of about 1,000 miles to the 250-mile VDOT–Virginia Tech pilot project. To ensure impartial oversight of contractors, CHAMPS was tasked to act as third-party administrator of performance assessment. In the past five years, interstate assets in 13 regions of Virginia have received full roadway asset inventory and condition assessment. Virginia Tech developed a method for tracking contractors’ achievement of performance goals. “It’s basically a report card,” de la Garza chuckles. “We’re a university, so, naturally, we keep grades.” The need to evaluate contractors’ performance inspired the use of mobile condition-assessment technology. Between 2001 and 2009, CHAMPS researchers produced numerous GIS-based reports on the condition of highway assets—with faculty and surveyors having to drive to the various regions and often collecting data on foot. “This method was labor-intensive as well as unsafe,” said de la Garza. “We thought it need not be so. If we could demonstrate that we can collect zillions of gigabytes of data with the IP-S2 mapping system attached to a vehicle driving at 65 miles an hour, and then clean up and analyze the data in the office using a semi-automated process, we would make the work more efficient and less costly.” The work proceeds in two phases. “The first step is when technicians identify roadside assets and features recorded by the IP-S2 on film. In phase two, computer programs, such as machine vision, are used to find signs and other road features and evaluate their condition.”

Topcon’s IP-S2 uses high-accuracy GPS, an inertial measurement unit, 360-degree digital imaging, and a laser scanner to collect data for geospatial information systems.
Two of the original three members of the “Smart Road crew” at Virginia Tech—Howerton and Sideris—received master’s degrees in May and have since moved on to careers in construction management and civil engineering. The third graduate student, Uslu, will continue the research while he studies for his Ph.D. Howerton’s contribution to the Smart Road project was research comparing traditional data-collection methods with the IP-S2 mapping method, in terms of both accuracy and time consumed. Like Sideris, Howerton was very interested in the management of low-capital highway assets. “We did some literature review and found that low-capital assets have been overlooked,” said Sideris. “Most contractors and DOTs try to make sure that the pavement is in good condition and bridges are strong enough to withstand heavy traffic—which makes sense. You want to have a safe highway. But low-capital assets are important for safety too. If a road sign is not where it should be, this can cause many problems. The highway is one system, and everything affects everything else. Everything is important.” Uslu, who came to Virginia Tech via the State University of New York and the Technical University in Istanbul, Turkey, will focus his research on automating the collection and monitoring of highway asset data. “I plan to come up with algorithms which would make sense of the laser data collected by the IP-S2,” said Uslu. “I want to have an algorithm which will detect an asset and evaluate its condition. There should also be an algorithm for detecting the presence of signs and failing them if they are damaged.” VDOT personnel will be notified when an asset meets, or falls short of, predetermined working-condition parameters. Uslu will likely collaborate with researchers at the Center for Human-Computer Interaction in developing these algorithms.
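The pass/fail logic Uslu describes can be pictured with a small sketch. Everything here is hypothetical: the function name, the attributes, and the numeric thresholds are illustrative stand-ins, not VDOT's actual working-condition parameters or the algorithms under development.

```python
# Hypothetical sketch of a rule-based condition check for a detected road sign.
# The attributes and thresholds below are invented for illustration only.

def sign_condition(tilt_deg, mount_height_m, retroreflectivity):
    """Return 'fail' if any measured attribute falls outside its
    (illustrative) tolerance, otherwise 'pass'."""
    if abs(tilt_deg) > 15:                  # pole leaning too far
        return "fail"
    if not 1.5 <= mount_height_m <= 3.5:    # sign mounted too low or too high
        return "fail"
    if retroreflectivity < 50:              # too dim to read at night
        return "fail"
    return "pass"

print(sign_condition(4.0, 2.1, 120.0))   # a healthy sign passes
print(sign_condition(25.0, 2.1, 120.0))  # a badly leaning sign fails
```

In a production system the inputs would come from feature extraction on the laser and image data rather than being supplied by hand.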
Viewing assets in a 360-degree radius from the vehicle carrying the IP-S2 system
An automated assessment system will result in “a huge cost savings for VDOT because 70 to 80 percent of the cost of highway maintenance programs is in sending people out to collect data,” said Uslu. “Not only is this way of doing business labor-intensive, but mistakes can be made during the collection.” The Center for Highway Asset Management Programs received the IP-S2 in the fall of 2009. Richard Rybka, Topcon’s mobile mapping specialist, provided support via phone and e-mail during its setup. A van was rented from the university’s Fleet Services Department, and a special roof rack was built to mount the IP-S2. The mapping unit also has a custom-designed hood, required to comply with Virginia Tech’s security regulations.
Fine asset details and dimensions from point cloud data
From data to a model

By the end of May, more than 20 asset data collections had been performed. Moving along a highway, the IP-S2 maps data in a 360-degree radius, at a distance of 10, 20, or 30 meters. The progress of the IP-S2 is monitored on a laptop; the computer uses a Web browser to communicate with the IP-S2 via an Ethernet cable. According to the students, “the real fun” begins when data from the different system components are integrated in one system for “geocleaning.” Five operations are performed in Geoclean: processing the IP-S2 raw data for subsequent operations; conducting inertial post-processing to create a geospatial vehicle trajectory; generating a LiDAR (Light Detection and Ranging) point cloud; converting compressed image files from a Ladybug camera into GIS maps; and registering the digital image sets to the trajectory and point cloud. While Sideris performs the sequential upload to a GIS model, Howerton explains that to get the best map view one can change the settings of the Ladybug CapPro viewer using software from Point Grey Research, the camera’s manufacturer. “The settings we use are 18 frames for every tenth of a mile,” he said. A key software program used to view, analyze, and extract features from the processed data is Spatial Factory, which merges imagery with point-cloud data layers. “Do you see these bubbles?” Sideris asks. “They are the raster data from the Ladybug.” He clicks on one of the bubbles, and up pops an image of a Smart Road section on the screen. He then pans the image in a 360-degree radius around the van, virtually putting himself back on the Smart Road. Clicking on a LiDAR point on the model reveals the X, Y, and Z coordinates collected by GPS. The point-cloud layer makes it possible to incorporate such data as topography and the reflectivity of pavement markings into the GIS model. This additional information is revealed on top of the underlying image.

“If you have an excavation, a cut-and-fill project, you can get the dimensions of the area, upload the data to any computer-aided design software that can process point cloud data, compare the visualization with a map you already have, and then you can get the cut and fill done quite accurately,” said Uslu with considerable satisfaction. [Photo credit: Unless otherwise stated, all photos are courtesy of Don Talend.]
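The five Geoclean operations form a strictly ordered pipeline: each stage depends on the output of the one before it. A minimal sketch of that ordering follows; the stage functions are placeholder stand-ins (the actual Geoclean software is Topcon's), and only the sequence itself comes from the article.

```python
# Sketch of the five Geoclean stages as an ordered pipeline. The stage names
# follow the article; the bodies are placeholders that record what each adds.

def preprocess_raw(run):         run["raw_processed"] = True; return run
def postprocess_inertial(run):   run["trajectory"] = "geospatial vehicle trajectory"; return run
def generate_point_cloud(run):   run["point_cloud"] = "LiDAR point cloud"; return run
def convert_ladybug_images(run): run["images"] = "GIS-ready imagery"; return run
def register_datasets(run):      run["registered"] = True; return run

GEOCLEAN_STAGES = [
    preprocess_raw,            # 1. process IP-S2 raw data
    postprocess_inertial,      # 2. inertial post-processing -> trajectory
    generate_point_cloud,      # 3. generate the LiDAR point cloud
    convert_ladybug_images,    # 4. convert Ladybug image files
    register_datasets,         # 5. register images to trajectory and cloud
]

def run_geoclean(run):
    """Apply the five stages in order; each consumes the previous output."""
    for stage in GEOCLEAN_STAGES:
        run = stage(run)
    return run

result = run_geoclean({})
print(result["registered"])  # True once all five stages have run
```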
Like Digital Flypaper

The mobile mapping system that Virginia Tech’s Center for Highway Asset Management Programs is using to help the Virginia Department of Transportation monitor low-capital highway assets works like flypaper with five different adhesives—except that it snags the geospatial data and images of everything near it, not the physical objects themselves. Topcon Positioning Systems’ IP-S2 combines three redundant positioning technologies with 360-degree digital imaging and laser scanners. The system consists of a dual-frequency, dual-constellation Global Navigation Satellite System (GNSS) receiver, which establishes the geospatial position of the vehicle; an inertial measurement unit (IMU), which tracks vehicle attitude (pose); and external wheel encoders, which capture odometry data from the vehicle. Integration of these technologies yields a three-dimensional position for the vehicle and provides accurate tracking in challenging or GNSS-denied environments. A high-resolution digital camera provides 360-degree images, and the system records and time-stamps inputs with a resolution of 15 nanoseconds. The IP-S2 also uses 3D laser scanners with an effective range of 30 meters. Every second, the scanners collect 45,000 x, y, and z points, which are used to obtain accurate geospatial positions for assets. Traditionally, LiDAR data have been collected from the air; because this system collects data from ground level, it provides critical data that cannot be obtained from aerial surveys.
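Combining the sidebar's 45,000-points-per-second scan rate with the 65-mph collection speed cited earlier in the article gives a rough sense of the data volume involved. This is back-of-the-envelope arithmetic, not a Topcon specification.

```python
# Rough laser point density per mile at highway speed, from figures in the article.
POINTS_PER_SECOND = 45_000   # points the scanners collect each second
SPEED_MPH = 65               # collection speed cited in the article

seconds_per_mile = 3600 / SPEED_MPH                 # about 55.4 s to cover a mile
points_per_mile = POINTS_PER_SECOND * seconds_per_mile
print(f"{points_per_mile:,.0f} points per mile")    # roughly 2.5 million
```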
The FIG REPORT VOL. V. NO. 3
John D. Hohol ACSM FIG Forum Head of Delegation
XXIV FIG Congress Facing the Challenges, Building the Capacity
The 2010 FIG Congress was the biggest FIG congress ever held, with more than 2,200 participants from over 100 countries in attendance, including 58 from the U.S. (see sidebar). The technical program included more than 800 papers presented in more than 150 technical and flash sessions, workshops, and special seminars.

opening ceremony
The opening ceremony at the Sydney Convention and Exhibition Centre (SCEC) attracted close to 2,000 people. They were welcomed by representatives of the Aboriginal Cadigal people, who performed “songlines,” traditional rituals believed to connect people and land and mark routes. Jonathan Saxon of the Surveying and Spatial Sciences Institute (SSSI) gave the welcome address on behalf of President Michael Giudici, who was not able to attend the congress because of health problems. Other welcome greetings were presented by Tony Kelly, Minister of Lands, New South Wales, and Congress Director Paul Harcombe. In his opening address, FIG President Stig Enemark stated that “the key challenges of the new millennium are climate change, food shortage, energy scarcity, urban growth, environmental degradation, and natural disasters.” All these issues relate to the governance and management of land, which bears directly on each surveyor and each national surveying association. “Sustainable land administration systems provide clear identification of the individual land parcels and land rights attached to these parcels,” said Enemark. “This information on
The ACSM FIG Forum Delegation Curt Sumner (Commission 1. Professional Standards and Practice); Steve Frank (Commission 2. Professional Education); Robert Young (Commission 3. Spatial Information Management); Jerry Mills (Commission 4. Hydrography); Dave Doyle (Commission 5. Positioning and Measurement); John Hamilton (Commission 6. Engineering Surveys); Don Buhler (Commission 7. Cadastre and Land Management); Bob Foster (Commission 9. Valuation and the Management of Real Estate); and John Hohol (ACSM FIG Head of Delegation). In addition, Daniel Helmricks, the winner of the ACSM FIG Forum FIG Congress Fellowship, participated in the activities of the Young Surveyors Network and presented his paper at the Congress. [see photo on p. 48]
the relationship between people and land is crucial and plays a key role in adaptation to climate change and in preventing and managing natural disasters.” According to Enemark, “the land management perspective and the operational component of integrated and spatially enabled land administration systems need high-level political support and recognition. Surveyors—the land professionals—are in a core position.” In her opening remarks, the Hon. Dr. Marie Bashir, Governor of New South Wales, said that she has been “increasingly interested in, and appreciative of, the contribution of surveyors—the
The ACSM FIG Forum Delegation to FIG Congress 2010 Inset: Daniel Helmricks
modest and often unsung heroes of civilization.” As part of the opening-day festivities, Dr. Bashir also unveiled a full-size statue of James Meehan, one of Australia’s pioneer surveyors, which was on display at the exhibition during the congress. Dr. Tim Flannery—one of Australia’s leading thinkers and writers, Australian of the Year 2007, and chair of the Copenhagen Climate Council (a think tank on climate change)—delivered the keynote address. “Surveyors,” he said, “are the custodians of an enabling technology that is critically important to our future. Surveyors should take a leading role, not only in monitoring climate change, but in explaining it to the broader public. Surveyors operate well in harsh conditions.”

plenary sessions
Four plenary sessions were held, focusing on: FIG achievements, 2007–2010; the spatially enabled society; climate change, natural disasters, urban growth, and land governance; and Google Earth and Internet approaches. Highlights from these sessions are given in the sidebar. The plenary sessions were complemented by very popular after-lunch talks. Presentations were made by Johannes Schwarz (Leica Geosystems) and Brent Jones (Esri) on how GIS and modern technology can respond to natural disasters such as the recent earthquake in Haiti. Prof. John McLaughlin, a legend in the surveying profession, spoke about the fourth wave of property reform.

technical sessions and commissions
The technical program ran for four days with 10 to 12 parallel sessions. Because of the huge number of papers, some were presented in a new flash-session series that worked quite well and was also well attended. Many sessions were standing room only. About ten per cent of the papers had gone through a peer-review process, which created more opportunities for academics to attend the congress. The peer-review option also seems to have raised the overall quality of the papers. In addition to regular technical sessions, invited paper sessions and commission plenary sessions were arranged.

1. FIG Achievements. President Enemark presented a most impressive list of FIG events, events jointly held with the World Bank, FAO, UN-HABITAT, and other partners, and projects and achievements described in the eight new FIG publications launched in Sydney.

2. Spatially enabled society. Dr. Abbas Rajabifard, President of the Global Spatial Data Infrastructure Association (GSDI), noted in his keynote that “in a spatially enabled society, a society manages information spatially by using a spatial component.” This requires data and services to be accessible, accurate, well maintained, and sufficiently reliable for use by society, even by members who are not spatially aware. Mr. Santiago Borrero, President of the Pan American Institute of Geography and History (PAIGH), emphasized the need for strong links between politics and the numerous disciplines of the surveying profession to enable better land governance. He also challenged FIG to play a leading role in this endeavor. The session ended with a presentation by Surveyor General Warwick Watkins, NSW, on a world-class best-practice approach using the State of New South Wales as an example.

3. Challenges facing surveyors and society. Dr. Paul Munro-Faure from the UN Food and Agriculture Organization (FAO) described the principles of good land governance, and Dr. Mohamed El-Sioufi from UN-HABITAT spoke about climate change and sustainable cities. Dr. Daniel Fitzpatrick (Australia) spoke on addressing land issues after natural disasters.

4. Google Earth and the Internet. In addition to the session’s key speakers, Dr. Mary O’Kane and Ed Parsons, FIG Vice President Matt Higgins addressed the increasing role of positioning techniques. In Australia, positioning infrastructure is expected to deliver productivity gains with a potential cumulative benefit of $67 to $124 billion over the next 20 years in agriculture, construction, and mining alone.
These included presentations by Keith Bell from the World Bank and Dr. Clarissa Augustinus from UN-HABITAT.

general assembly
Fifty-four member associations participated in the first session of the assembly, and sixty-seven took part in the second session, which included the elections for president and vice president (66 members were able to vote). Three strong candidates, all current FIG vice presidents, ran for the post of FIG president. Teo Chee Hai from Malaysia, who received 27 votes in the first
An Aboriginal welcome to FIG Congress 2010
round and 33 in the second, was elected to be the next FIG president. He is the first FIG president from Asia, and his term is for four years beginning January 1, 2011. In addition, two new vice presidents were elected; their terms of office expire on December 31, 2014. They are Dr. Chryssy Potsiou from Greece, who prevailed in the second round, and Prof. Rudolf Staiger (Germany), who received the most votes overall. The term of office of the third vice president, Dr. Dalal S. Alnaggar from Egypt, will expire on December 31, 2012. The General Assembly also appointed new chairs for the ten FIG Commissions, to serve from 2011 to 2014. They are:
• Commission 1: Leonie Newnham, SSSI, Australia
• Commission 2: Prof. Steven Frank, ACSM, USA
• Commission 3: Prof. Yerach Doytsher, ALSI, Israel
• Commission 4: Dr. Michael Sutherland, CIG, Canada, Trinidad and Tobago
• Commission 5: Mikael Lilje, ASPECT, Sweden
• Commission 6: Dr. Gethin Wyn Roberts, ICES, United Kingdom
• Commission 7: Daniel Roberge, CIG, Canada
• Commission 8: Wafula Nabutola, ISK, Kenya
• Commission 9: Prof. Frances Plimmer, RICS, United Kingdom
• Commission 10: Robert Šinkner, CUSC, Czech Republic
Other new appointments were: Christiaan Lemmen from The Netherlands becomes the new chair of OICRF; Kate Fairlie from Australia will chair the Young Surveyors Network; and David Martin from France will chair the Standards Network. The next FIG Congress will be held in Kuala Lumpur, Malaysia. This will mark the first time that a FIG Congress is organized in Asia. The FIG currently has 103 member associations, including five that joined this year: Albania Association of Geodesy, Ordre des Géometrès Experts du Benin, Geodetic Association of Herceg-Bosnia (Bosnia and Herzegovina), Cyprus Association of Valuers and Property Consultants, and Nepal Institution of Chartered Surveyors. Ten new affiliate members were also approved in Sydney:
• Afghan Land Consulting Organization (ALCO), Afghanistan
• Italian Society of Photogrammetry and Topography (SIFET), Italy
• Swiss Federal Office of Topography (Swisstopo), Switzerland
• National Cartographic Center, Iran
• Afghanistan Information Management Services (AIMS), Afghanistan
• Bureau of Land Management Cadastral Survey, USA
• State Geodetic Administration of the Republic of Croatia (DGU), Croatia
• Survey of Israel, Israel
• Ministry of Lands and Mineral Resources, Fiji
• Agency for Land Administration and Cadastre of the Republic of Moldova
Three corporate members were admitted to FIG membership: Geotrilho Topografia Engenharia e Projecto lda, Portugal; Geoweb S.p.A., Italy; and Coordinates Magazine, India. FIG now has 89 academic members, including:
• The National Research Institute of Astronomy & Geophysics, Geodynamics Department, Egypt
• The College of Estate Management, United Kingdom
• Brno University of Technology, Czech Republic
• College of Technology, University of Houston, USA
• Center for Soil Protection and Land Use Policy (ZBF-UCB), Germany
• School of Rural Estate and Planning, Reading University, United Kingdom
New memoranda of understanding were signed between FIG and the Global Spatial Data Infrastructure Association (GSDI) and the Pan American Institute of Geography and History (PAIGH). The General Assembly agreed to establish the FIG members’ database by the end of 2010. This web-based database will offer information on surveying and the profession in each FIG member country. Eight major publications were launched in Sydney, with eight more expected by the end of 2010. The declaration of the outcome of FIG Congress 2010 will be published after June 14, the deadline set for comments on the draft Sydney Declaration presented by President Enemark at the second General Assembly.
Marie Bashir, Governor of New South Wales, and Warwick Watkins, Surveyor General, NSW, welcome guests to Government House
Esri’s 2010/2015 Updated Demographics Data Forecasts Socioeconomic Trends Full roster of current data variables gives complete picture of U.S. demographics
Esri has released its 2010/2015 Updated Demographics data, which offers more than 2,000 data variables, including current-year estimates and 2015 forecasts for 11 different geographies from national to block group levels. This data can help identify areas of high unemployment, adjustments in the housing market, rising vacancy rates, changes in income and consumer spending, and increased population diversity. Agencies, businesses, and organizations can use the data to analyze trends, identify growth, and reveal new market opportunities. “Updated data variables, such as population, housing, age, income, and home value, ensure that analysts can conduct their research with the most accurate information available, particularly for fast-changing areas,” says Catherine Spisszak, Esri data product manager. “Esri’s data strategy for 2010 is to provide a classic update—the full range of Esri Updated Demographics data that analysts and knowledge workers have relied on for decades.” The Updated Demographics data is being released on a flow basis. Currently, more than 60 variables are available in the Demographic & Income Profile Report from Esri Business Analyst Online, Esri’s on-demand market analysis tool. All the updated variables will also be available soon as ad hoc data and in Esri Business Analyst (desktop and server). “Esri pays close attention to economic and social trends and how they influence the needs of businesses, consumers, and citizens,” says Lynn Wombold, chief demographer and manager of data development at Esri. “For example, although signs indicate economic recovery from the recession, the impact on the average consumer continues to be very personal. Housing is down, foreclosures are up, income is declining, and population growth is slowing. The challenge of successfully weathering the current economy underscores the importance of having access to accurate information. 
Current data can track critical changes and preclude the cost of being wrong.” For more information, visit www.esri.com/ datawhatsnew or call 1-800-447-9778.
Director Generals Forum: This first-ever event was attended by director generals from 50 national mapping and cadastral agencies. On the agenda were issues facing national mapping and cadastral agencies.

Workshop on UN Millennium Development Goals challenges in the South Pacific region: 50 representatives from small island states attended. The outcome of the seminar will be published as a report of this Congress.

Workshop on the history of surveying: Attendees learned about legendary surveyors and saw historical places of interest to surveyors in Australia.

Technical tours were organized to enable participants to learn about surveying in Australia, especially in Sydney and New South Wales.

FIG 2010 Exhibition: Sponsored by 50 companies, the exhibition showcased a number of the latest technologies.

Social events: Several impressive events were organized, including the welcome reception, held at the recently renovated Sydney Town Hall; a reception hosted by Governor Marie Bashir in Government House; the FIG Foundation Dinner, organized as part of a Sydney Harbor cruise; a Gala Dinner in Darling Harbor; and visits to the homes of Australian colleagues.
In his closing remarks, President Enemark, whose term ends this year, emphasized that “land professionals need to increase their role in developing sustainable cities. They need to take a more engaging and leadership role in the area of climate change. The partnership with UN agencies is the key to recognition and enhancing the status of the global surveying community—leading to more effective solutions to global land issues.” The next FIG conference will be the FIG Working Week in Marrakech, Morocco, May 18-22, 2011.
L-R: Clarke and Bramble
Earth science has a field day The dream begins with a teacher who believes in you, who tugs and pushes and leads you to the next plateau, sometimes poking you with a sharp stick called “truth.” —Dan Rather. — by Stephen C. Letchford, LS
At Matoaca Middle School in Chesterfield, Virginia, Cheryl Clark, earth science teacher, and the school’s vice principal, Shannon Bramble, “tugged and pushed” to create, with the help of local surveyors, an earth sciences field day like no other. The field day, now in its second year, was held in March at the school’s campus and at Matoaca Park. Five survey stations were established. To enrich students’ knowledge and experience, five student teams were created to test their mettle at topographic surveying and mapping, determining elevations from angles and heights, and using GPS and other current surveying methods. At the stations, the students participated in a Robotic Relay Course and tried to solve four challenges: a Law of Cosine Challenge, an RTK Rover Topo Challenge, a Trig Elevation Challenge, and the Challenge of Distance Stadia. In the Law of Cosine Challenge, which was judged by Andy Bowles, LS, senior surveyor with Tommy Barlow & Associates, students were tasked with calculating a missing line using angle measurements, chained distances, and the Cosine Rule.
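The Law of Cosine Challenge reduces to one formula: given two chained distances a and b and the measured angle C between them, the missing line is c = sqrt(a^2 + b^2 - 2ab cos C). A quick sketch (the function name is ours, not part of the event):

```python
import math

# Missing-line computation from the Law of Cosines:
#   c^2 = a^2 + b^2 - 2ab*cos(C)
def missing_line(a, b, angle_c_deg):
    """Length of the side opposite angle C, given sides a and b."""
    c_squared = a**2 + b**2 - 2 * a * b * math.cos(math.radians(angle_c_deg))
    return math.sqrt(c_squared)

# With a 90-degree angle the rule collapses to the Pythagorean theorem:
print(missing_line(3.0, 4.0, 90.0))  # ~5.0
```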
The Robotic Relay Course was a practical demonstration of some of the strengths and challenges one faces when surveying with a robotic total station. The course was given by Bob Leyden, survey coordinator with Gardy & Associates, PC, who also timed the students as they navigated a rather challenging course without losing lock. Each of the five student teams took the Trig Elevation Challenge. To solve it, they had to calculate the elevation of various points on the ground using a survey instrument measuring slope distances and zenith angles only. Mike Carris, LSiT, president of Carris Technology Solutions, advised students on how to use math to solve this challenge and judged the contest. The Stadia Distances challenge was set up to illustrate how to calculate the total perpendicular distance between five points and a survey baseline. William Ware, Jr., LS, and Stephen Letchford, LS, organized this challenge. Organized by Tommy Nichols, LSiT, sales associate with Allen Precision, the RTK Rover Topo challenge was, by all accounts, a favorite, since the students could see what was being located in real time on the controller computer screen.

Calculating a missing line, with Andy Bowles, LS

Nichols, being the consummate instructor that he is, had an unending amount of enthusiasm for each group of students that came his way. His extremely generous donation of ten compasses for the winning team was very much appreciated by all involved. What a difference a year made in this fledgling program at Matoaca Middle School! In 2009, forty students from one class participated; this year, sixty-five students from three classes took part. Six surveyors, eight parent volunteers, and four teachers were involved in pulling this great event together. Word of Matoaca’s field day has reached Dr. Jeremy Lloyd, Director of Science for Chesterfield County Schools, who expressed great interest in expanding the program to multiple middle and high schools within the Chesterfield region, so that more students could be exposed to the practical application of the mathematics taught in county schools.
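The Trig Elevation Challenge the students solved rests on one relation: the height difference to a point is the vertical component of the slope shot, that is, slope distance times the cosine of the zenith angle. A simplified sketch that ignores instrument and rod heights:

```python
import math

# Vertical difference from a slope distance and a zenith angle.
# A zenith angle of 90 degrees is a level sight; smaller angles look uphill.
def elevation_difference(slope_distance, zenith_deg):
    return slope_distance * math.cos(math.radians(zenith_deg))

# A 100 ft shot at an 80-degree zenith angle rises about 17.4 ft:
print(round(elevation_difference(100.0, 80.0), 2))
```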
Robotic Relay; Clarke navigates Leyden’s challenging course
Measuring slope distance and angles to calculate elevation
Reading the stadia hairs on a rod 100’ away
Many thanks to all who participated in this year’s field day, but most of all to the students who showed appreciation for demonstrations of what surveyors can do with the math and earth sciences being taught by their teachers. We hope that one day, a few of the bright minds we had the privilege to meet will think back on this event and decide to become part of the next generation of land surveyors in Virginia.
If you would like to take part in next year’s event, need further information, or have an idea on how we might improve the program going forward, please feel free to contact Stephen C. Letchford, LS, at (804) 267-1258 (firstname.lastname@example.org).

Taking the RTK Topo challenge with a total station
Long before GPS, you had to have portolans Library of Congress holds conference on origins of portolan charts —by Neely Tucker
John Hessler, mathematical wizard and the senior cartographic librarian at the Library of Congress, slipped into the locked underground vaults of the library one morning earlier this week and approached a priceless 1559 portolan chart on the table before him, sketched in the hand of Mateo Prunes, the Majorcan mapmaker. The nautical map of the Mediterranean and Black seas is inked onto the skin of a single sheep. The chart is a rare representative of one of the world’s greatest and most enduring mysteries: Where and how did medieval mapmakers, apparently armed with no more than a compass, an hourglass, and sets of sailing directions, develop stunningly accurate maps of southern Europe, the Black Sea, and North African coastlines, as if they were looking down from a satellite, when no one had been higher than a treetop? The earliest known portolan (PORT-oh-lawn) chart, the Carta Pisana, appears in about 1275—with no known predecessors. It is perhaps the first modern scientific map, and it contrasts sharply with the “mappae mundi” of the era, the colorful maps with unrecognizable geography and fantastic creatures and legends. The method by which it was created bears no resemblance to the methods of the mathematician Ptolemy and does not use measurements of longitude and latitude. And yet, the map “seems to have emerged with stunning accuracy from the seas it describes,” one reference journal notes. No one today knows who made the first maps, or how they calculated distance so accurately, or even how all the information came to be compiled. “The real mystery is that if you took all the notebooks the sailors used in making these charts, along with the coordinates and descriptions, you still couldn’t make these maps,” Hessler said, tapping the glass that covers the ancient vellum. Hessler, 49, one of the world’s leading experts trying to decode the mysteries of the maps, presented some of his dazzlingly intricate research at a May 25th conference at the library, “Re-Examining the Portolan Chart: History, Navigation and Science.” Sponsored by the Philip Lee Phillips Society, the fundraising arm of the library’s Geography and Map Division, the conference drew about 200 academics, donors, and collectors to a day-long session that tackled the ancient mystery of the portolans (from the Italian word for “ports”). It was one of those moments in which Washington, invariably portrayed as a dry city of bureaucrats, revealed itself as a place filled with people who could, with a little fictional help, just as easily be the basis for a ripping good thriller. “People think maybe the Romans, or the Phoenicians, or even aliens made the first portolans and they’ve been lost,” says Evelyn Edson, author of “The World Map: 1300–1492” and one of the conference’s speakers. “They certainly seem related to the introduction of the compass, in the 11th century. But there’s nothing at all to explain how they were made. . . . It’s been very tempting for people over the years to try to make up the answer.” “The ancient Greeks and Romans had traditions of mapmaking,” Hessler said. “There’s Ptolemy, and there’s a line of progression. But here, the art of making maps
explodes out of nowhere. It appears to be a true invention of the Middle Ages."

Hessler's approach to studying the charts isn't cultural or nautical; it is entirely mathematical. He has taken 22 of the few hundred portolan maps known to be in existence and measured them against modern maps of the same areas. He uses, say, 100 points of comparison on each map and then applies complicated algorithms to calculate the differences between each pair of corresponding points. (We could go into your basic Euclidean transformation method of calculating scales of error, and of course the Helmert transformation, but since these calculations take three or four months for each map, let's just move along.)

Hessler compares the two maps on a computer-modeled overlay, with the scale of error then plotted onto a "deformation grid." He can then see where the charts were more accurate and where they were less so, from which he infers where sailing and close observation took place and which areas were more loosely charted. This, in turn, reveals more about the birthplace and methodology of the map. For example, the maps were good in the various seas of the Mediterranean but terrible once out in the Atlantic, rounding up to the British Isles.

"That tells me different sources were used to make the same map," Hessler noted. "So now you start to discover where those different charts came from and how they got to the mapmaker."

The charts' usage began to come to an end with trans-Atlantic exploration. For all their regional accuracy, the mapmakers did not know how to compensate for the curvature of the Earth on a flat map. Across the Mediterranean, they could take you from port to port because the distances were comparatively small. Over the Atlantic, if you set out for modern-day Miami using their charts, you'd wind up on Long Island.

Still, the portolans were reliable guides to the known world for 400 years, and they have concealed the secrets of their origins and methods for another four centuries, leaving the answers to the realm of novelists and storytellers.

"Even with all the research that has been done on portolans the world over, there's not a single question about them that we can definitively answer," Hessler said, looking up from the Prunes masterpiece.
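The article does not publish Hessler's actual algorithms, which take months per chart. For curious readers, though, the core idea of the Helmert transformation he mentions can be sketched briefly: fit a translation, rotation, and scale that best map control points on the old chart onto the matching points on a modern map, then examine the leftover residuals (the raw material of a "deformation grid"). The sketch below, in Python with NumPy, is an illustration of the standard 2-D, 4-parameter variant under our own simplifying assumptions; the function name and setup are hypothetical, not Hessler's code.

```python
import numpy as np

def fit_helmert_2d(src, dst):
    """Least-squares 4-parameter Helmert (similarity) transformation
    mapping src points onto dst points.

    Model:  x' = tx + a*x - b*y
            y' = ty + b*x + a*y
    with a = s*cos(theta), b = s*sin(theta).
    Returns (tx, ty, scale, rotation_radians) and per-point residuals.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    n = len(src)
    # Design matrix: two rows (x and y equations) per control-point pair.
    A = np.zeros((2 * n, 4))
    A[0::2, 0] = 1.0            # tx appears in the x equations
    A[1::2, 1] = 1.0            # ty appears in the y equations
    A[0::2, 2] = src[:, 0]      #  a*x
    A[0::2, 3] = -src[:, 1]     # -b*y
    A[1::2, 2] = src[:, 1]      #  a*y
    A[1::2, 3] = src[:, 0]      #  b*x
    obs = dst.reshape(-1)       # interleaved [x1', y1', x2', y2', ...]
    params, *_ = np.linalg.lstsq(A, obs, rcond=None)
    tx, ty, a, b = params
    scale = np.hypot(a, b)
    theta = np.arctan2(b, a)
    fitted = (A @ params).reshape(-1, 2)
    # Residual distances: large values flag loosely charted regions.
    residuals = np.linalg.norm(dst - fitted, axis=1)
    return (tx, ty, scale, theta), residuals
```

On real chart-versus-modern-map point pairs, the interesting output is the residual vector: small residuals where the coastline was sailed and closely observed, large ones where the chartmaker was guessing.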
Designing Tomorrow: America’s World’s Fairs of the 1930s exhibition opens October 2010
In the midst of the Great Depression, tens of millions of visitors flocked to world's fairs in Chicago, San Diego, Cleveland, Dallas, San Francisco, and New York, where they encountered visions of a modern, technological tomorrow unlike anything seen before. Architects and industrial designers such as Raymond Loewy, Norman Bel Geddes, Henry Dreyfuss, and Walter Dorwin Teague collaborated with businesses such as General Motors and Westinghouse to present a golden future complete with highways, televisions, all-electric kitchens, and even robots.

The National Building Museum's new exhibition, Designing Tomorrow: America's World's Fairs of the 1930s, is the first ever to consider the impact of all six American world's fairs of the Depression era on the popularization of modern design and the creation of a modern consumer culture. On display from October 2, 2010, through July 10, 2011, Designing Tomorrow brings together more than 200 never-before-assembled artifacts from the six fairs. The exhibition further explores how the 1930s world's fairs were used by leading corporations and the federal government as laboratories for experimenting with innovative display and public relations techniques, and as grand platforms for the introduction of new products and ideas to the American public.

Designing Tomorrow is organized into seven thematic galleries: Welcome to the Fairs, A Fair-going Nation, Building a Better Tomorrow, Better Ways to Move, Better Ways to Live, Better Times, and Legacies.
As a companion to the exhibition, Designing Tomorrow, a collection of essays, will be published in September 2010. The collection celebrates the influence and impact of the world's fairs of the 1930s and the complicated negotiations brokered between tradition and avant-garde design in the cutting-edge work that was presented. The book is edited by Robert W. Rydell, professor at Montana State University, and Laura Burd Schiavo, curator of Designing Tomorrow and assistant professor at the George Washington University. The collection is published by Yale University Press.

Also in conjunction with the exhibition, the Museum is developing a variety of education programs intended to further examine the impact of world's fairs on modern design. Visitors to the Museum can also take advantage of free, docent-led tours of the exhibition beginning October 23, 2010. For details and up-to-date information on the exhibition and associated programming, please visit www.nbm.org or call 202.272.2448.
ACSM BULLETIN june 2010
CONNECTIONS THAT WORK FOR YOU.
November 8–10, 2010 The Mirage, Las Vegas
Don’t miss Trimble Dimensions 2010—the positioning event of the year! It’s the one place where you can make connections and gain insight into positioning solutions that can transform the way you work. Be inspired by our panel of visionary guest speakers. Increase your knowledge base from hundreds of educational sessions that focus on surveying, engineering, construction, mapping, Geographic Information Systems (GIS), geospatial, infrastructure, utilities and mobile resource management solutions. Register now and you’ll learn how the convergence of technology can make collaborating easier and more productive to gain a competitive edge.
To find out more about Dimensions 2010, visit www.trimbleevents.com ©2010 Trimble Navigation Limited. All rights reserved. PN# 022540-039 (6/10)
Network Rover Receiver
RTK...the Easy Way With the GRS-1 network rover, no external modem is needed . . . it’s all built in. By attaching an easy-to-connect external antenna and accessing a local RTK network, centimeter accuracy is instantly achieved. Operate the GRS-1 as a standalone, hand-held unit for sub-foot navigation and mapping. Learn more at topconpositioning.com/grs1
Complete network rover kit—
It’s time. topconpositioning.com