The Future of Analyzing Data on Maps
Last year was an exciting and busy time for Web mapping applications. Google Earth, Microsoft Virtual Earth and several startups pushed out impressive applications while the traditional geographic information systems (GIS) industry continued to play catch-up. Though Google and Microsoft have done a strong job integrating satellite imagery, three-dimensional views and buildings, a piece of the puzzle is still missing.
The real value of traditional GIS was its ability to enable better geographic decisions with desktop application analysis tools. However, the extensive training and background in GIS necessary to conduct analysis and understand the results remains a large barrier to growth. For example, one of the leading GIS providers offers more than 150 different training classes, and then there are infrastructure considerations, equipment costs and the challenge of integrating these technologies into mainstream IT systems.
The addition of major Web-based mapping applications to the marketplace has put dynamic pressures on the GIS industry. These companies are now working to better serve their corporate constituency with Web-based applications that don't require massive investments or knowledge acquisition.
Today's ability to understand data on maps provides an interesting juxtaposition. On one side of the equation are popular and easy-to-use Web mapping applications, but they do not have the right analytic tools to enable decision-making. On the other side are expensive, proprietary GIS systems that offer robust analytical tools and decision support that are not intuitive or open.
In order to break this impasse, the industry must address factors preventing intuitive Web-based mapping applications that provide true geographic analysis. The challenges fall into two broad areas: how to deliver intuitive analysis to the Web browser and how to make data available to feed the analysis.
The Analytics Problem
Bridging the considerable gulf between Web mapping and traditional GIS requires harnessing the power of legacy geographic analytics within the easy-to-use world of Web mapping. Today's Web mapping applications plot data with pushpins on maps, but they do not solve problems. In order to move beyond pushpins, the market needs geo-analytics (the geographic analysis of data). Users need the fundamental ability to discern whether location A is better than location B.
There are two substantive barriers to adding geo-analytics to Web mapping applications. The first is the computational and processing limitations. The second involves mass users' lack of technical knowledge and the difficulty creating ease of use.
Computational limitations represent the largest barrier to successfully creating Web-based geo-analytics. A wide variety of geo-analytics can be performed in traditional GIS, but they often rely on database-intensive processing and imaging techniques that are slow on a desktop computer and cannot be delivered through a browser at all. It's no longer acceptable for a desktop application to take minutes or even hours to complete an analysis. Such delays will not be tolerated in today's Web 2.0 environment.
Web applications are a different universe; studies have shown that four seconds is the maximum amount of load time the average user is willing to wait.1
The time it takes for many desktop GIS analytics to run is not acceptable to a general Web audience. Even GIS companies that have tried to Web-enable their products have not been able to support geo-analytics beyond simple thematic maps, driving directions and reports. If analysis is going to be delivered to the new large audience that Web mapping applications have created, it will need to be dynamic, fast and efficient.
The "ease of use" barrier for a broad, non-technical user base is formidable. Traditionally geo-analytics have required a robust understanding of both the geospatial and mathematical concepts underlying them. To run analyses, users have needed a sound understanding of mathematical concepts ranging from interpolation to kernel density analysis and Gaussian distance decay functions.
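To make the mathematical barrier concrete, consider the Gaussian distance decay function mentioned above. A minimal sketch in Python (the function name and bandwidth value here are illustrative, not drawn from any particular GIS product): nearby observations keep nearly full weight, while distant ones fade smoothly toward zero.

```python
import math

def gaussian_decay(distance_km: float, bandwidth_km: float = 2.0) -> float:
    """Weight an observation by its distance from a point of interest.

    Uses a Gaussian kernel: weight = exp(-d^2 / (2 * h^2)), where h is
    the bandwidth controlling how quickly influence fades with distance.
    """
    return math.exp(-(distance_km ** 2) / (2 * bandwidth_km ** 2))

# With a 2 km bandwidth, a point 1 km away keeps most of its weight,
# while a point 6 km away contributes almost nothing.
weights = [round(gaussian_decay(d), 3) for d in (0.0, 1.0, 2.0, 6.0)]
```

This is exactly the kind of concept an expert analyst tunes by hand (choosing the bandwidth, interpreting the falloff) and that a mass-market Web application would need to hide behind sensible defaults.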
Without either the necessary education or extensive training, traditional geographic analytics are well out of the reach of the typical user. However, the Web 2.0 phenomenon has opened mapping to a huge new nontechnical audience. The key to effectively leveraging that exposure is to solve real-world problems for this new user set, thus providing them real value. In short, new apps must be simple, a common paradigm in successful technology applications and devices.
The Data Problem
In order for analysis tools to provide value, they need data to feed them. Further, because the majority of Web applications are free, access to large amounts of free data that is easily digested by nontechnical users is required. This would also provide value to corporations by allowing them to access large amounts of public data for marketing purposes, including census-based demographic information.
Geographic data can be found in a disconnected hodgepodge of locations all over the Internet. One can see glimpses of it in the thousands of map mash-ups scattered across the Web, but they only allow users to look, not analyze. There are a variety of data repositories across the Internet and some very good meta-lists providing links to them. However, there is no universal way to search, identify and utilize geographic data in a meaningful way without going to each repository and sifting through reams of metadata and lists.
Federal, state and local governments as well as nongovernmental organizations (NGOs) produce the vast majority of geographic data, much of which is free; its creators want the general public to have it. The data simply needs to be organized and made accessible.
Currently, geographic data is stovepiped. There is no networked way to derive value beyond a single data set. For instance, there is value in seeing crime rates for Chicago, but that value pales in comparison to the power of mashing up crime rates, housing values, schools, police stations, emergency call boxes and neighborhood watches in order to find the safest place to invest in a residential real estate project.
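The real estate scenario above can be sketched in a few lines of Python. The data, field names and weights below are entirely hypothetical; the point is how little logic is needed once multiple data sets can actually be joined on a neighborhood.

```python
# Hypothetical neighborhood records combining several mashed-up data sets;
# the field names and values are illustrative only.
neighborhoods = [
    {"name": "Lakeview", "crime_rate": 4.1, "schools": 3, "call_boxes": 6},
    {"name": "Hyde Park", "crime_rate": 1.2, "schools": 5, "call_boxes": 9},
    {"name": "Pilsen", "crime_rate": 2.7, "schools": 2, "call_boxes": 4},
]

def safety_score(n: dict) -> float:
    # Lower crime raises the score; schools and call boxes add to it.
    # The weights are arbitrary stand-ins for an analyst's judgment.
    return -2.0 * n["crime_rate"] + 1.0 * n["schools"] + 0.5 * n["call_boxes"]

# The "safest place to invest" is simply the highest-scoring neighborhood.
best = max(neighborhoods, key=safety_score)
```

The scoring itself is trivial; the hard problem the article describes is getting the crime, school and infrastructure data sets out of their stovepipes and into one joinable form in the first place.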
The more data, the more value is possible and the more problems potentially can be solved. Open source, proprietary and third-party vendor data need to all be easily reachable in a few clicks if Web mapping applications are really going to drive value to their users.
Current Efforts and the Future
The popularity of mash-ups has led to the emergence of a variety of startups seeking to monetize the phenomenon. Sixty percent of all mash-ups utilize a mapping component, which holds some of the best potential for success.2 The integration of mapping components has driven increasing innovation to solve both the data and analysis problems.
On the data side, companies such as Platial, with its People's Atlas, are driving user-generated geographic data. Platial does not harness the world of existing data, but it has created a very easy-to-use interface that allows people to create and share geographic data. Various real estate startups are developing analytic tools to help users make better housing selections. Zillow, Trulia, Neighboroo and Hotpads all offer heat maps to help their users look at data and statistics that relate to residential neighborhoods. While these programs are limited to single-variable thematic maps, the trend illustrates the growing demand for map analytics to utilize increasingly available data.
Much work remains to truly fuse the disparate worlds of GIS and Web mapping, but the possibility of creating intuitive geographic intelligence for the masses is within sight, and with it, the potential to help a wide new range of individuals and businesses solve their geographic problems in an intuitive, inexpensive way.
The ability to create a new market with user-friendly applications is one of the great potentials of Web 2.0 technologies, and mapping applications are at the forefront. Google demonstrated the financial power of opening Internet advertising to the mass market, creating access for small and medium-size businesses. There's similar potential to provide the broad market with the means to analyze geographic-based issues and hypotheses with Web mapping applications. It will certainly be an exciting road to travel.
Sean Gorman is CEO and founder of FortiusOne. Prior to founding FortiusOne, Gorman was a research assistant professor at George Mason University's School of Public Policy. Gorman has also served as VP of R&D for a telecommunications mapping firm and director of strategy for a Washington, DC-based technology incubator. His research has focused on infrastructure security and has been featured in the Washington Post, Wired, Der Spiegel, Associated Press, CNN, MSNBC, Fox, CNBC and NPR. Gorman also serves as a subject matter expert for the Critical Infrastructure Task Force and Homeland Security Advisory Council. He may be reached at email@example.com.