GIS Monitor - Archives

2005 August 25


Editor's Introduction

This week I continued my education in remote sensing — and share with you my conversations with Tom Dooley, of Fiber Specialists Inc., David Betzner, of Aero2, Inc., and Kenneth A. Stumpf, of Geographic Resource Solutions, Inc. (GRS).

— Matteo

Fiber Specialists Inc.

Tom Dooley is a project manager for Fiber Specialists Inc., a company that handles installation, construction, maintenance, engineering, GIS, and facility management for communications systems. He discovered GIS about five years ago, when he was looking for a better alternative to CAD to map his clients' networks. A week ago his company completed an inventory, down to the level of individual jacks, of the entire telecommunications network of the City of Phoenix, using Network Engineer, a telecom GIS package by Telcordia.

"We build fiber optic telecom networks," Dooley told me, "mostly city-to-city, underground construction, etc. I have been doing this since 1971. For years there's been a tremendous lack of as-built documentation. Around 1990 we started using AutoCAD to document what was out in the field, but it was much too complicated. One of our guys went to a convention about GIS for airports and when he came back we realized that GIS was the perfect solution for us."

Despite his company's success, Dooley is modest about its GIS expertise: "We're getting into GIS slowly - we are not GIS people, we are telecom people. I'm coming into this as a telephone guy that wants to give the customer a better record than what he has now." Fiber Specialists finds "outside plant" (that is, any part of a network that is outside a building), inventories it and assesses its condition. The company then publishes the information securely on-line, or privately on the customer's LAN, so that the client's staff can read it using a Web browser. "GIS got too complicated for most of our customers, who are mostly former Telecom guys who were promoted to supervisors," says Dooley. "So, we had to make the data accessible to them via a browser. We cannot breeze through ArcGIS easily, but we make it easy for our customers to do so on their browser."

The project for the City of Phoenix involved integrating the city's telecom network documentation with its SAP accounting package. "They wanted to use SAP to keep track of where their money went," Dooley told me - and we both quipped that the new system allows the city to do so almost literally! I asked Dooley why the project took more than two years to complete. "One reason it took us so long," he told me, "was the difficult ArcGIS - SAP integration." Data, he explained, is stored in one of the two databases, and Network Engineer, which acts as the front end for the ArcGIS engine, knows where to find it in response to a query. The tables are populated on a website using Microsoft's version of the Active Server Pages standard and are then updated via the city's intranet. "It is now a living document," Mike Maier, Fiber Specialists' GIS specialist, told me. "The beauty of this," he added, "is that both ArcGIS and Active Server Pages use Microsoft Access as their database."

The company has also mapped portions of the cable network operated by the Arizona Department of Transportation. For that project, it mostly used aerial photos procured and owned by county governments as backdrop, but also paid for a fly-over that took pictures with a resolution of about 3 inches per pixel. To make it easier to identify the lids of the state's manholes in the photos, crews painted them all white prior to the fly-over. They also placed a banner on a roof, to serve as a kind of copyright marker - small enough that only somebody aware of it would be able to locate it. "In the photos, you can see the manhole covers," Dooley told me, referring to the junction boxes that have "pull wires," which are detectable ends of copper wire that can be used to trace a network.

This image shows some locating and mapping done by Fiber Specialists for an engineering firm that was working for the Arizona Department of Transportation. The company's task was to find and analyze the conduit and pullboxes on both sides of the freeway, in anticipation of using the infrastructure for future fiber optic cables to support an FMS (Freeway Management System) project. In ArcGIS, each dot is linked to a table listing what is in that particular pullbox — such as "3-3 inch conduit and 1-1.5 inch conduit, all with detectable pulltape, and capped." A JPEG is also linked to each dot, showing the open pullbox, sometimes from different angles. Clicking on a line segment will bring up a description of the ductbank between two pullboxes.

Using handheld pen tablet computers running ESRI ArcPad and connected to Trimble Ag132 DGPS receivers, company staff then traced the cables from each pull box and mapped the network. They took digital pictures of each pull box and later associated them with the boxes' locations using the time stamps from the digital cameras and from the GPS receivers. They also added notes - such as "lid is crushed," "no pull wire," or "conduit is full of mud." The end product was a small CD, containing pictures of the open conduit boxes: clicking on an image brings up a diagram of the wiring. This is all handled in a browser and the company includes the free ESRI ArcReader on the disk. "This way, everybody can get the information," Dooley told me. "It is a way of multiplying and dispersing our knowledge about their network."
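The time-stamp matching described above can be sketched in a few lines. The function below is a hypothetical illustration (the names, tolerance, and data are mine, not Fiber Specialists'): given a time-sorted GPS track and a list of camera timestamps, it assigns each photo the fix logged closest in time, skipping photos taken too far from any fix.

```python
from bisect import bisect_left
from datetime import datetime, timedelta

def match_photos_to_track(track, photo_times, max_gap=timedelta(seconds=30)):
    """track: list of (datetime, lat, lon) tuples, sorted by time.
    Returns {photo_time: (lat, lon)} for photos within max_gap of a fix.
    Assumes the camera and GPS clocks are already synchronized."""
    times = [t for t, _, _ in track]
    matches = {}
    for pt in photo_times:
        i = bisect_left(times, pt)
        # the nearest fix is either just before or just after the photo time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - pt))
        if abs(times[best] - pt) <= max_gap:
            matches[pt] = track[best][1:]   # (lat, lon)
    return matches
```

In practice the two clocks drift, so a real workflow would first estimate a constant offset (for example, by photographing the GPS receiver's own clock display) and apply it to the camera timestamps before matching.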

His company's products, Dooley points out, are "not a substitution for compliance with blue-stake laws," which mandate that all underground facilities must be located and marked prior to excavation, trenching, or other digging. Rather, they enhance the subsurface utility engineering (SUE) process.

What's the biggest challenge Dooley faces in this work? "I can find people who are good at GIS, GPS, surveying, mapping, or telecoms," he told me, "but the hardest part is finding people who are good at telecom and at any two of the other disciplines."

Aero2, Inc.

David Betzner is the president of Aero2, Inc., a geospatial information company located in Philadelphia, Pennsylvania. It opened in December 2002 and specializes in providing large-scale photogrammetric mapping for engineering and survey firms. "We do a wide range of types of mapping at all different scales," Betzner told me. "We specialize in mapping for land development firms and planners. Our client base consists mostly of land surveyors, engineers, municipalities, and state agencies. We can work nationwide and have accepted work from out of state and out of region, but the majority of my clients are local; 75 percent are from the tri-state area."

I asked Betzner first how he procures his ground control. "We start with aerial photography. Then we get a surveyor to go out and collect the ground control points. Then we use SoftPlotter, from The Boeing Company, as our turnkey system for aerotriangulation, stereo-compilation, and to generate digital orthophotographs."

Why do you do your own surveying, instead of buying ground control points from companies that have an archive? "It depends on the quality of the data that is out there to purchase," Betzner answered. For most of his work, he needs very accurate vertical position. "Terrain extraction is critical and for that I have to have a licensed land surveyor out there occupying the sites, whether pre-targeted or post-flight photo ID points. I need really good GPS data. I really don't see a lot of sharing going on. I've used my own control redundantly, but that's for my own projects."

I pressed Betzner further on this issue: what about using GPS/IMU? The bigger projects, he told me, such as city-wide and state-wide updates, use the newer technology. "I cannot take that liability risk. My average size projects, 90 percent of them, are between 25 and 200 acres. I use airborne GPS for larger projects, but I cannot risk GPS/IMU when they are moving dirt around on the ground. I used it, tested it, and have not been completely satisfied - but in some applications it will do."

What are you working on right now? "Right now I'm doing eight miles of linear strip mapping for the New York Department of Environmental Protection for a new sewer project. My feature list has 140 layers. We fly low and show lots of detail. About half the time we deliver at 1 inch = 20 feet."

What is your final product? "Eighty percent of our deliveries end up in AutoCAD, because that is what that surveying community uses for the most part. The other twenty percent ends up in a GIS database."

What has changed in your corner of the geospatial industry in the past few years? "I've been able to streamline some of my production process with automation tools. Five to eight years ago I had to manually input break lines and mass points for orthophoto generation. Now we can use auto-correlation techniques to greatly reduce our production costs."

What's on your wish list? "I hope for further improvement in auto-correlation and automatic feature extraction tools, because that will cut down on some of my time."

What do you think of Microsoft Virtual Earth and Google Earth? "I use those and Global Mapper, to review and take a quick peek at my area, when generating proposals. Some of our clients are using [those systems] to scout out their areas. The mass-market products may have hurt some people but have helped some people as well. If you want to do something really generic, they are good enough. But to get really robust products you have to go with a really robust firm. Otherwise you don't use your system to full capacity and that is a real waste of money."

How do you see the market expanding? "Townships want to come on board with GIS: if they don't already have it they want it. If they don't know about it I tell them. I am including GIS training in my proposals."

Geographic Resource Solutions, Inc.

Kenneth A. Stumpf is Director, Remote Sensing Applications, for Geographic Resource Solutions, Inc. (GRS), a company based in Arcata, California. Before I had a chance to ask him any questions, he told me that he loves his job — which these days includes traveling to beautiful locations in national parks and visiting unique natural features to collect ground truth to use in land cover mapping projects. "Often when working in the field," he told me, "I feel very fortunate that I actually get paid to do what I do!" He described what he sees every day in the field as a "visual smorgasbord." "Last summer, in Wrangell-St. Elias National Park, Alaska, as we flew through a pass into a valley near the Nabesna Glacier, everything we saw up and down the valley looked yellowish. As we got closer to the ground, we were all amazed: we found that we were in an area covered with one to two inches of moss in the gravel bars between the larger rocks. Later, as we progressed down the valley everything was dominated by a whitish cast: the moss on the ground had changed to a dense cover of lichens."

As we began to discuss the substance of his work, it quickly became apparent that he has strong opinions on his craft and that GRS's approach to land cover mapping differs significantly from traditional image classification and photo-interpretation efforts. Current plans are to test the two approaches side-by-side in an upcoming U.S. Geological Survey (USGS) project in which GRS is participating as one of two mapping teams. The teams will both map a national park to be selected for its diversity in landscape, vegetation, and terrain. (However, because data collection typically accounts for about two thirds of the cost of a mapping project, the two mapping teams will coordinate their field data collection and accuracy assessment efforts.) At the end of the project, the USGS will test the two sets of maps for accuracy, robustness, and usability for different resource management applications. "We had our first meeting at the end of June to begin planning this project," Stumpf told me, "and we might get started by next summer."

As an extreme example of what happens when image classification is not done correctly, Stumpf cites the case of several National Forests in Oregon, including the Deschutes National Forest. Some National Forests have stopped using image classification mapping capabilities and gone back to photointerpretation, because the agency's image classification data for that region "turned out to be inaccurate and unusable and they believe that photointerpreted results are more reliable," he told me. "I think that image classification has had a 'black-eye' for about the last ten years. Results were promised and then never really delivered. Clients believed the image processing community and it turned out that it was not as simple as promised. I've been told that that mapping project ended up costing the federal government nearly 15 million dollars in contractor and agency costs."

According to Stumpf, photointerpretation for large project areas that involve hundreds of thousands of acres is more expensive and does not always generate the level of detail needed. In Alaska, the National Park Service (NPS) has contracted with GRS to map some of its national parks, after previous attempts with several different companies did not generate the results it needed. "They finally received a data set that had the level of detail they wanted from us," he told me. "They don't need categorical or legend-based maps, which is what people typically produce using image classification and photointerpretive efforts. Those maps have no further information in the database than the general categorical class information. For most purposes — such as fire fuel class mapping and modeling, wildlife habitat suitability studies, and change detection — resource managers need more than that. We map the components of the different features and types — like GIS layers; they are the building blocks of the landscape and can be used and analyzed in many different ways."

So, what distinguishes GRS's approach from that of its competitors? Stumpf listed four differences:

The first one is the illumination correction. In one of its first projects, in the early 1990s, GRS, as it later realized, had systematically misclassified old growth. "Our original maps," Stumpf recalls, "were 65-70 percent accurate. We found a definite correlation of misclassification with slope and aspect: most of our errors were on the northerly facing shaded slopes." So the company studied the literature on illumination corrections, developed and tested some of the models, applied them to the imagery, and reprocessed all the data. "Our map accuracy, after performing the illumination correction, jumped to between the mid-80s and the low 90s for our canopy cover, tree size, and type map estimates. We've used the illumination correction procedure ever since. When we started working in Alaska we were told it wouldn't work up there due to the high latitudes (low sun angles) and rugged terrain, but it has worked great. Most people never even do it and using uncorrected imagery is a leading cause of confusion and misclassification."
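The article does not say which correction model GRS settled on, but the simplest member of the family it tested is the cosine correction, which rescales each pixel by the ratio of the solar zenith cosine to the cosine of the local incidence angle (a function of slope, aspect, and sun position). The sketch below is a generic textbook version, not GRS's implementation; the shadow cutoff of 0.01 is an arbitrary choice of mine.

```python
import math

def cos_incidence(slope, aspect, sun_zenith, sun_azimuth):
    """Cosine of the local solar incidence angle; all angles in degrees."""
    s, a = math.radians(slope), math.radians(aspect)
    z, az = math.radians(sun_zenith), math.radians(sun_azimuth)
    return math.cos(s) * math.cos(z) + math.sin(s) * math.sin(z) * math.cos(az - a)

def cosine_correct(radiance, slope, aspect, sun_zenith, sun_azimuth):
    """Rescale a pixel value as if it lay on flat, fully lit terrain."""
    ci = cos_incidence(slope, aspect, sun_zenith, sun_azimuth)
    if ci <= 0.01:
        return radiance   # self-shadowed pixel: leave uncorrected
    return radiance * math.cos(math.radians(sun_zenith)) / ci
```

The effect is exactly what GRS observed in its error analysis: sun-facing slopes are darkened toward their "flat-terrain" value, while shaded northerly slopes are brightened, so that one ground cover no longer splits into several spectral classes by aspect. (The plain cosine model is known to overcorrect steep shaded slopes, which is why variants such as the Minnaert correction exist.)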

Second, GRS maintains every individual field training site as a separate class during the image classification efforts, so that each one has a very specific ecological characterization. "Most people cluster their training data by some type characteristic they are trying to map and wind up with high spectral variances. We don't cluster anything. We are 'splitters' rather than 'lumpers.' Our methodology is quite different from what is taught in school and what others are doing. Image processing is like making ice cream, in that there are many different flavors one can make. Some approaches generate bland generalized results while others generate many different variations."

The third difference is an intimate knowledge of the terrain being mapped. One of Stumpf's rules is that everyone involved in land cover image classification must actually visit the area they are mapping — rather than just sit in front of a computer screen back at the office. "I'm an advocate that everyone who is doing image classification has to have their feet on the ground. We collect quantitative estimates on the ground, we don't eyeball anything unless forced to. We get out of the truck and off the road." This way, he explains, they are able to come up with estimates of stand characteristics, such as structure, that cannot be seen in the imagery. "We see it on the ground in our data," he says, "and since we have confidence in our classification results we can relate the ground data to the individual classes in the final map."

The last difference is that GRS prequalifies all field sites before visiting them. "We preprocess the imagery to identify areas that are suitable candidate sites for collecting field data. A suitable site must be spectrally homogeneous and large enough for an adequate field sample." Using this approach GRS knows which sites are frequently occurring and which ones are rare. They can then plan their field data collection effort so that they cover the diversity of types while not being overly redundant in their sampling. "We very rarely reject training data from a candidate site, as we know we can develop a pretty homogeneous signature for that site. When field data collection costs are so high you don't want to collect unneeded or erroneous data."
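The prequalification step can be pictured as a moving-window homogeneity screen over the imagery. This toy single-band version (window size, threshold, and plain-list representation are all my simplifications, not GRS's actual preprocessing) flags windows whose standard deviation stays under a cutoff, i.e. candidate sites that are spectrally homogeneous:

```python
def homogeneous_windows(image, win=3, max_std=5.0):
    """image: 2-D list of pixel values for one band.
    Returns top-left coordinates of win x win windows whose
    standard deviation is <= max_std (candidate field sites)."""
    rows, cols = len(image), len(image[0])
    hits = []
    for r in range(rows - win + 1):
        for c in range(cols - win + 1):
            vals = [image[r + i][c + j] for i in range(win) for j in range(win)]
            mean = sum(vals) / len(vals)
            std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
            if std <= max_std:
                hits.append((r, c))
    return hits
```

A production version would run over all bands, merge adjacent qualifying windows into polygons, and check that each polygon is large enough to hold an adequate field sample, as Stumpf describes.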

GPS receivers are one of GRS's key tools. Stumpf always carries two and has them operating continuously all day long — because, he says, "typically one will fail to receive data at some point." For his purposes he does not need very high accuracy. "I'm happy if I'm +/- 5 meters," he told me. After processing the candidate site database to select the next set of field sample locations, he loads them as targets into his GPS receivers. GPS is also vital to Stumpf for another reason: including fuel and the pilot's fee, helicopter services may cost as much as five to six thousand dollars a day; therefore, he needs to know his position at all times with a very high degree of confidence, so as not to waste precious time flying over the wrong areas, searching for sites, or collecting data in the wrong location.

Every night, when he gets back to his base camp, Stumpf downloads the data he has collected that day and overlays it over satellite imagery and the candidate site database. "I go through a debriefing like after a bombing run," he explains. "We can see how many of our targets we've hit and adjust our future plans accordingly."

Because he is constantly looking for unique areas, he uses the training data he's collected the day or week before to decide where not to visit the next day. "If we follow our plans, we are basically assured a very diverse set of ground sample points that we use to train our image classification processes that go into making the final map."

According to Stumpf, it is a common misconception that good mapping requires very high-resolution data. In fact, he points out, such data can lead to significant problems, such as splitting a tree canopy into two parts — shaded and illuminated — because the side facing the sun has a different spectral signature from the shaded side. While GRS can model differential illumination due to slope and aspect, it cannot model the illumination on every tree crown. "Vendors," he says, "want us to go to high-resolution imagery. But high-resolution does not necessarily mean that you can identify more features. Instead, I want a pixel that is larger than the objects I am mapping. The pixel is a sample unit and you'll get less variance within pixels of a class and more variance between classes." In four weeks of field sampling, GRS collected field data that will be used to map the 18 million acres of Wrangell-St. Elias National Park and Preserve in Alaska, which contains five of the nine tallest mountains in North America. According to Stumpf, the final cost of GRS's efforts on this project will be only about 2 cents per acre.
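Stumpf's variance argument is easy to demonstrate numerically. In this contrived example (the values are purely illustrative), a "class" whose fine pixels alternate between a sunlit-crown value and a shadow value shows large within-class variance at fine resolution; once pixels are aggregated to roughly the scale of a whole crown, each pixel averages over that crown/shadow mix and the class becomes spectrally tight:

```python
def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Fine resolution: pixels alternate sunlit crown (100) and shadow (60).
fine = [100 if i % 2 == 0 else 60 for i in range(100)]

# Coarse resolution: each pixel averages one sunlit/shadow pair.
coarse = [(fine[i] + fine[i + 1]) / 2 for i in range(0, 100, 2)]

variance(fine)    # large spread within a single cover type
variance(coarse)  # the same cover type, now spectrally uniform
```

With the within-class spread collapsed, the remaining variance in a scene sits between classes, which is what makes a classifier's job tractable — the point Stumpf is making about pixel size as a sampling unit.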

Stumpf has been working with computers his entire professional life. "My master's thesis was a deck of cards," he told me, referring to computer punch cards. He has therefore lived through the explosion in computer processing power and storage capabilities. His laptop computer can now do in an hour the processing that a mainframe computer used to take ten days to do. In 1989, he recalls, one megabyte of memory cost about one thousand dollars; today his $180 PDA has 39 megabytes of memory. Yet, he complains, most people are still using image processing tools developed when memory and data storage were very expensive. "The tools can stand to catch up with the technology," he told me.

According to Stumpf, there are two models of how to handle land cover mapping using image processing, and both can be represented by a pyramid. In the typical approach, the pyramid is balanced on its point, which represents the small amount of field data collected; the middle of the pyramid represents the modeling efforts, assumptions, and criteria used to generate map estimates; and the broadest part represents the outcome: a wide spectrum of uncertain results that depends on the assumptions made. Those who use this approach "spend most of their time tweaking their assumptions in an attempt to generate the outcome they think is right." The alternative approach flips the pyramid flat onto its base, which represents a substantial amount of accurate field data; the mid-portion represents the processes used to analyze and synthesize the results; and the point at the top represents the very focused, detailed, and accurate set of map information that is the final product, delivered without further editing or manipulation. GRS uses the latter model.

I asked Stumpf what is on his wish list for image processing technology. The data collection technology is adequate, he told me, "except that we are losing some of the capability due to the scan line problems with Landsat 7." It is the software tools and methodology that now have to catch up. "Most of them have not changed substantially in 20 years. People are still making the same old mistakes while using classification processes, and then say that image classification doesn't work." Stumpf thinks some people now rely on predictive modeling, rather than image classification, "because they were never successful using image classification tools and gave up on image classification. Most people are using classification models that are bound to generate uncertainty, because that is what they learned in school. You need to forget much of what you learned in school. In fact, I find it easier to hire an inexperienced image analyst who doesn't have a lot of bad image classification habits."

Finally, I asked Stumpf about the impact on the remote sensing industry of the advent of mass-market satellite imagery and aerial photography, via the likes of Google Earth and Microsoft Virtual Earth. His conclusion is that this development is bad news for smaller remote sensing and photogrammetry companies, because the casual user will opt for the free imagery rather than pay for something better. His own company, he pointed out, has been using Terraserver imagery this past year to gain information about areas it is involved in mapping. "I have downloaded thousands of images for virtually nothing, whereas I never would have acquired the imagery at what it would cost as an individual image acquisition project." However, he cautioned, "some of the imagery is as old as 1994 and you cannot assume that the ortho correction is great. You just have to recognize its limitations and act accordingly."

News Briefs

Please note: I have culled the following news items from press releases and have not independently verified them.


RMSI, a geospatial information and software services company, and its consortium partners have won a large data capture mapping contract with Land Registry in Ireland. The consortium, which also includes Landmark Solutions, a UK company, and Proteus Solutions Ltd., based in Cork and Dublin, Ireland, will convert, store, and manipulate the Registry's 32,000 map sheets, which contain an estimated 2.4 million land parcels. The database of maps will link to the corresponding Land Registry folio.
     The new five-year contract commenced in June 2005 and will involve converting existing paper map records for over 1.6 million properties into an electronic format as part of the Land Registry's large-scale Digital Mapping Project. RMSI, which also has offices in the United States and the UK, expects to hire an undisclosed number of new employees in fulfillment of the large contract.

AGL Resources, an Atlanta-based energy services holding company, has selected ESRI for its enterprise GIS standard. The company acquired an ArcGIS site license in 2001 to support the migration of the legacy GIS for two subsidiaries, Atlanta Gas Light (AGL) and Chattanooga Gas (CGC). AGL Resources is currently leveraging the license to replace legacy systems of additional subsidiaries as well as extending the license to cover new acquisitions. AGL Resources aspires to be the preeminent distributor of natural gas on the East Coast, providing service through six utilities, two gas storage facilities, and an asset management company and serving more than two million customers. The site license includes ArcView, ArcEditor, ArcInfo, ArcSDE, and ArcIMS software.

Intransix, a provider of wireless location-based applications, has added ESRI's ArcWeb Services maps and data into its WorkoutGPS wireless and Web-based application for GPS-enabled mobile phones. ArcWeb Services are hosted Web services that include map data and on-demand geospatial capabilities needed to add real-time locations, addresses, points of interest, dynamic maps, and routing directions to Web and wireless applications. Map data storage, maintenance, and updates are handled by ESRI, eliminating up-front map database, geospatial software, and ongoing maintenance expenses.
     WorkoutGPS is a personal fitness management tool, enabling users to measure and collect outdoor exercise regimen data with simple-to-use GPS-enabled wireless phones. The application consists of a Java-based handset component that records distance, speed, and caloric-burn metrics, plus a personal Web journal in which users can store their training logs, set account preferences, download new workout profiles to phones, and link personal accounts to partner Web sites.
     WorkoutGPS training logs provide complete mapping coverage for the United States and Canada, including street, topographic, and aerial maps, made available through ESRI's ArcWeb Services. WorkoutGPS was recently used by the Webcor Endurance Cycling Team competing in the Insight Race Across America bike race, the longest race of its kind in the world, covering 3,059 miles.
     WorkoutGPS is currently offered on commercially available NEXTEL GPS-enabled Java phones. Additional wireless service provider support is planned for future product releases.

Local Matters Inc., a provider of search and content solutions to media publishers and directory assistance service providers, will incorporate Telcontar's Drill Down Server (DDS) geospatial software platform to provide maps, routes, and turn-by-turn directions for its multi-category Internet search solution. Telcontar is a supplier of software and related services for the location-based services (LBS) industry.
     As enabled by Telcontar, Local Matters can present local, multi-category search results as points of interest on a map, as a tool for both advertisers and users. Because Telcontar Drill Down Server can accommodate and host various types of data and applications, Local Matters can provide its customers with customized and dynamic offerings. With Telcontar DDS, a variety of real-time applications, including local search, turn-by-turn directions, and traffic monitoring and rerouting, can be delivered to multiple devices, including mobile, PC, and telematics units.

Varion Systems, the software development and value-added reseller division of GeoAnalytics, Inc., has been retained by the White House Utility District (WHUD), Tennessee to extend the integration between Azteca Systems' Cityworks asset maintenance management software and the Tennessee One Call System. The not-for-profit Tennessee One Call is more commonly known as a "call before you dig" service and acts as an advance notification service to operators of underground facilities anywhere within the state.
     Varion Systems was initially hired by WHUD to develop a program that would automatically create service requests in Cityworks based on individual locate tickets received as e-mail messages from the One Call system. The second phase of the project will focus on leveraging the District's GIS to spatially filter locate notifications that fall outside the District's service territory. The District has already seen improved efficiencies in the handling of One Call locate notifications and this integration extension will continue that trend.

Leica Geosystems has supplied to the New York State Department of Transportation 84 GX1230 dual-frequency GPS survey receivers for RTK data and GeoOffice post-processing software for 11 locations. The company will also supply technical support and service under a five-year service agreement. The GX1230 receivers are designed with Leica Geosystems' new SmartTrack GPS measurement engine and incorporate self-checking RTK algorithms and a self-explanatory graphical user interface. The GPS instruments, with magnesium alloy construction, are built to the toughest MIL specifications to withstand extreme field conditions. The SmartTrack technology ensures reliable centimeter-accuracy solutions at distances of 30 kilometers or more.

The National Geospatial-Intelligence Agency (NGA) has selected Feature Analyst automated feature extraction software for early technology insertion into the NGA production environment. In 2006, the NGA intends to deploy the product, made by Visual Learning Systems (VLS), across the agency's Integrated Exploitation Capability (IEC) workstations. The Feature Analyst and LiDAR Analyst software provide toolsets for enhanced 2D and 3D feature extraction and land cover classification from monoscopic, stereo, and LIDAR imagery sources. Since its inception the NGA has conducted comprehensive testing of Feature Analyst. In addition, NGA is providing research funding to VLS to further improve the product.


The Enterprise for Innovative Geospatial Solutions (EIGS) will hold its first EIGS Exchange on August 30, in Jackson, Mississippi. The event is designed to highlight groundbreaking technologies from the geospatial technology industry, provide an opportunity to learn more about the innovative products and services from Mississippi's geospatial industry cluster, and hear about the latest developments at the local, state, and federal level.
     EIGS Exchange will feature presentations and speakers from all levels of government and private industry. Of special note are addresses by Clint Brown, Director of Software Products for ESRI, and Karen Yeager, of Dutko Worldwide, who recently served in the White House advising the President on homeland security issues. To give an update of activities at the federal level, the event is featuring comments from Yosef Patel of the U.S. Department of Energy and Chuck Dull of the U.S. Department of Agriculture.
     From the state perspective, EIGS Exchange attendees will hear from Gray Swoope of the Mississippi Development Authority, Colonel Don Taylor and Bud Douglas of the Mississippi Department of Human Services, and Craig Orgeron of the Mississippi Department of Information & Technology Services. Joel Yelverton of the Mississippi Association of Supervisors and Renee Newton, GIS Director for the City of Tupelo will provide a glimpse of what is going on at the local level.


During the past year, the Urban and Regional Information Systems Association (URISA) has been updating past workshops and has introduced several new ones for presentation at its annual conference, which will take place October 9-12 in Kansas City, Missouri. At the conference, the following new workshops will be presented: Field Force Automation and mGovernment; 3D Geospatial: Project Implementation Methods and Best Practices; and An Overview of Open Source GIS Software. The association's website has complete session descriptions and presenter information. The conference includes 13 full-day pre-conference workshops, nearly 60 ninety-minute educational sessions, a keynote address, a comprehensive exhibit hall, networking events, and about 200 presenters.

GIS Monitor Back Issues

Advertise with Us

You can reach more than 17,000 GIS professionals every issue by sponsoring GIS Monitor. For more information, email us.


Please send comments and suggestions to:

Matteo Luccio, Editor
GIS Monitor


GIS Monitor is published by:

GITC America, Inc.
100 Tuscanny Drive, Suite B1
Frederick, MD 21702 USA
Tel: +1 (301) 682-6101
Fax: +1 (301) 682-6105


If you wish to subscribe or unsubscribe visit our subscription page.
Copyright 2005 by GITC America, Inc. Information may not be reproduced, in whole or in
part, without prior authorization from GITC America, Inc. GIS Monitor is a GITC publication.