Surveying, Mapping and GIS

Geospatial Technology, End to End...

Exploring all aspects of mapping and geography, from field data collection, to mapping and analysis, to integration, applications development, enterprise architecture and policy.
Showing posts with label EPA.

EPA ARRA Mapper

Posted by Dave Smith On 12/02/2010 09:20:00 PM 15 comments

Since joining EPA I've been engaged in a wide variety of projects and efforts - one which we are currently getting out the door is an upgraded mapper for EPA projects funded by the American Recovery and Reinvestment Act (ARRA), otherwise known as Stimulus or the Recovery Act.

The major categories of projects receiving EPA funding via ARRA include Superfund Hazardous Waste Cleanup, Leaking Underground Storage Tanks, Clean Water State Revolving Fund (typically wastewater treatment), Drinking Water State Revolving Fund (potable water), National Clean Diesel Campaign, and Brownfields.

The idea was to provide more granular data across the various programs through which EPA has been directing ARRA funding to projects.


The mapper reports on a quarterly basis, in concert with the ARRA reporting requirements, and was built on the ESRI Flex API. As a quick overview, it shows statewide figures as a choropleth map, with summary tables:
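To give a sense of the data handling behind a view like this, here is a minimal sketch of the state-level rollup that would drive the choropleth and its summary tables. The real mapper is a Flex application backed by EPA databases; the input file and field names below are hypothetical.

```python
import csv
from collections import defaultdict

# Hypothetical input: one row per ARRA award with state, program category and amount.
# The real mapper is a Flex application backed by EPA databases; this only shows the
# kind of state-level rollup that drives a choropleth and its summary table.

def summarize_awards(path):
    by_state_program = defaultdict(float)   # (state, program) -> total dollars
    by_state = defaultdict(float)           # state -> total dollars, for the choropleth
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            amount = float(row["award_amount"])
            by_state_program[(row["state"], row["program"])] += amount
            by_state[row["state"]] += amount
    return by_state_program, by_state

if __name__ == "__main__":
    by_state_program, by_state = summarize_awards("arra_awards.csv")
    for state, dollars in sorted(by_state.items()):
        print(f"{state}: ${dollars:,.0f}")
```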

Visitors can click on the menu to view awards by program category, or to view all awards, for example:


The pushpins indicating awards can then be selected, and info boxes will pop up with the details. In the example below, we asked the mapper to show "Chelsea, MA" and turned on Clean Diesel awards; clicking on the map pin, we get the goods: two awards for Chelsea (note: the spelling of "Collabrative" comes directly from the database):


This application should improve transparency, with the direct intent of showing users tangible benefit by making visible what's going on right at the community level. As for lessons learned, the technology was far less of a challenge than the learning curve of how government works, and navigating my way through various EPA offices and stakeholders and gaining their acceptance and participation. My many thanks to all those who helped out.

MyPropertyInfo

Posted by Dave Smith On 9/10/2010 10:54:00 AM 5 comments

As I have been ramping up on EPA's Facility Registry System over the last couple of months since coming on board, I have also had the opportunity to work on a number of other projects - one recent one that's rolled out is MyPropertyInfo.

The most truly fun thing about working in EPA's Office of Environmental Information is that they are involved in a lot of collaborative, cross-cutting efforts, so I get exposed to a lot of different things across the agency. As an example, in working with EPA's Freedom of Information Act (FOIA) officer Larry Gottesman and the FOIA staff, they were pursuing the idea of using greater accessibility to reduce FOIA requests - for instance, common requests for data which EPA already publishes, but which may be scattered across separate locations in the agency.

One example of this is MyPropertyInfo - http://epa.gov/myproperty/

Here, we sought to address frequently-asked questions about properties. This type of basic background and screening is highly useful and important to bankers, realtors, prospective buyers, developers and others who deal in real estate and properties - yet, to gather all of the relevant information about a property, one might have to visit multiple sites across EPA, or submit a FOIA request and wait for EPA to gather the data from those disparate sources. So what we did in the case of MyPropertyInfo is quickly roll out a tool that basically just gathers that existing content in one place, and additionally provides it in printer-friendly form.

Though it was essentially just screen-scraping (as we do not directly control some of the source reporting systems), it was nonetheless a quick and effective way of getting questions answered. Moving forward, it also demonstrates that by using approaches that provide easily integratable content, such as web services in addition to traditional HTML reports, content can be even more elegantly repurposed and reused in a variety of effective ways to answer business questions - with web services associated with the reporting engines, the widgets and iPhone apps for these types of applications will virtually build themselves. For example, real estate sites like Zillow.com would also be able to dynamically pull environmental profile information about properties of interest to prospective buyers - hopefully a vision for the future at EPA.
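As a rough illustration of the aggregation pattern, here is a minimal sketch that pulls several existing reports for a location and assembles them into one printer-friendly document. The endpoints, parameters and section names are placeholders, not real EPA services.

```python
import urllib.request

# Hypothetical report endpoints -- MyPropertyInfo actually assembles existing EPA
# report pages; these URLs and parameters are placeholders for illustration only.
SOURCES = {
    "Regulated facilities": "https://example.gov/facilities?zip={zip}",
    "Cleanup activities": "https://example.gov/cleanups?zip={zip}",
    "Brownfields": "https://example.gov/brownfields?zip={zip}",
}

def gather_property_info(zip_code):
    """Fetch each existing report for a location and collect it into one document."""
    sections = {}
    for title, url_template in SOURCES.items():
        url = url_template.format(zip=zip_code)
        with urllib.request.urlopen(url) as resp:
            sections[title] = resp.read().decode("utf-8", errors="replace")
    return sections

def to_printer_friendly(sections):
    """Concatenate the gathered sections into one plain, printable report."""
    return "\n\n".join(f"== {title} ==\n{body}" for title, body in sections.items())
```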

Here is some additional perspective on MyPropertyInfo as posted to EPA's Greenversations blog by the FOIA office's Wendy Schumacher:  http://blog.epa.gov/blog/2010/08/30/my-property-info/

National Environmental Information Exchange Network

Posted by Dave Smith On 4/11/2009 06:54:00 PM 2 comments

Here's another exciting bit of news - my firm is teamed with CGI Federal on USEPA's Software Engineering & Specialized Scientific Support (SES3) Contract, and we just got word that our team has won EPA's Central Data Exchange (CDX) task. This is very exciting news: CDX and the Exchange Network serve the community by facilitating the exchange of a wide variety of environmental data between federal, state, tribal and other partners - it is a partnership that has proven itself to be tremendously effective and a great model for other types of data exchanges as well.

What I am particularly excited about is leveraging the infrastructure that has already been built to more robustly support geodata services, and ultimately enhanced reporting, metrics, analytical capabilities, and other capabilities to support feds, states, tribes and others in informed decisionmaking toward environmental policy and stewardship.

As such, we also anticipate we will be looking to grow as a company, and will be looking to hire additional technical gurus with capabilities in data exchange, data management and data flows, particularly if you have prior experience with and knowledge of EPA's Exchange Network and CDX, and/or geospatial technology.

If you are interested, drop me a line at dsmith (at) synergist.tech.com

EPA Environmental Information Symposium

Posted by Dave Smith On 12/29/2008 04:33:00 PM 0 comments

Earlier in the month, I attended EPA's Environmental Information Symposium. While I didn't post any updates here during the conference, I will now take the opportunity to carry over some of the more fun posts that I made to the Ning site that was set up for the Symposium:
Wordle: Web 2.0 Themes for the EPA Environmental Information Symposium
"Liberate The Data"

A Web 2.0 Success Story: Apps for Democracy

I touched on this during my presentation:


  • “Smarter, Better, Faster, Cheaper: Pick 4” – Vivek Kundra, District of Columbia CTO


  • The District of Columbia published an Open Data Catalog: GeoRSS, XML, KML and other data types


  • They then posted a contest and invited the public to build applications on their Open Data Catalog


  • RESULT: In 30 days: 47 new applications for the web, Facebook and mobile clients, over $2,000,000 in development at a cost of $50,000 = over 4000% ROI


  • http://www.appsfordemocracy.org/

The Web is the Platform

As a potential future paradigm, web-enabled connectivity binds together disparate resources, across EPA program offices, regions, labs, both horizontally and vertically, by transparently supporting access to data, analysis and resources:


For external stakeholders, those EPA resources then similarly become transparent, as part of the "EPA cloud" on the web, whereby the public, whether academia, industry, state or other government alike can access available resources toward supporting their own business requirements, whether watershed stewardship groups, regulated reporting industry, ecology research in academia or others:

EPA GIS Workgroup

Posted by Dave Smith On 5/12/2008 09:37:00 AM 0 comments

While everyone else is off to Where, et cetera - I'm going off to the EPA GIS Workgroup meeting in New York City...


Always great to see what's going on in terms of GIS and remote sensing for visualization, modeling and analysis in the realm of environmental protection - usually most EPA regions and program offices are represented, along with other organizations and agencies... Lots of geo friends in attendance.

Looking forward to it.

USEPA Geospatial Information Officer Selected

Posted by Dave Smith On 7/02/2007 11:04:00 AM 0 comments

From a memo circulated by USEPA CIO and AA for USEPA's Office of Environmental Information (OEI), Molly O'Neill:

I am very pleased to announce and welcome Jerry L. Johnston, Ph.D as the new Geospatial Information Officer (GIO) for EPA. Jerry has many years of geospatial experience in a variety of public and private sector positions. Most recently, he was a manager in the Environmental Fate and Effects Division in the EPA Office of Pesticide Programs where he successfully integrated geospatial analysis into the regulatory assessment processes of OPPTS. Prior to joining EPA, Jerry was in private industry where he held numerous roles including that of a Chief Technology Officer.

Over the course of his distinguished career, Jerry has demonstrated an exceptional ability to apply geospatial technology to help solve complex environmental problems. His understanding of information technology, policy development, geospatial data, and the business processes of EPA will be invaluable as we develop new approaches for managing and using environmental information.

I have mentioned to several people that I believe our geospatial work is just beginning and there are great opportunities ahead. Jerry's experience, energy, and fresh perspective will be important to leading this effort and I am confident that he will work collaboratively with the entire GIS community at EPA to make huge strides.

Jerry will be joining us full-time in mid-July. Please join me in welcoming Jerry to OEI!

Congratulations to Mr. Johnston. This will be a challenging, key role in the agency moving forward, for governance, best practices, data stewardship, enterprise architecture, SOA, OMB GeoLoB and so on.

ESRI UC Photos on Flickr

Posted by Dave Smith On 6/24/2007 09:57:00 AM 0 comments

I had a chance to get my photos from the ESRI Conference trip off of my camera, and have posted a sampling of them to Flickr...

See the slideshow...

Unfortunately it's an old camera; I didn't have it with me for some of the events and sessions, and several of my photos did not come out, but I am posting what I can.

Plenary: Analysis!

Map Gallery: Some of my stuff:

Cadastral Fabric:

Marston Smith doing his electric cello thing...

Some good friends at the EPA booth: Left to right: Ayhan Ergul (Innovate), Wendy Blake-Coleman (EPA), Claudia Benesch (CSC), Jessica Zichichi (Innovate), and Riva San Juan (Indus)... Ayhan and Jessica won first place for the Embedded GIS category in the User Application Fair for the Metadata Editor, and Jessica also won the Women's Overall in the ESRI 5k run.

UC...

And more... check the slideshow out...

Emergency Response and GIS

Posted by Dave Smith On 6/14/2007 07:23:00 AM 0 comments

One of the other events which will be happening concurrent to the ESRI Conference is the SONS 07 event - "Spill Of National Significance" - which will replicate a major catastrophic event, to include simulated release of oil, hazardous material, and/or other associated threats to health and safety. This year's event will take place in the Midwest, replicating an earthquake along the New Madrid fault zone.

Through our work with USEPA, we will be among the participants in this event, as we were for Hurricane Katrina, supporting the effort through our own GIS staff at the USEPA Emergency Operations Center. The two main stakeholders and participants on the environmental side are USEPA and USCG, along with FEMA, state, regional and private sector participants supporting the response as appropriate.

It's good to see these events take place, and it no doubt will give us many more new lessons learned and opportunities to refine response. The bottom line still comes down to being proactive, in terms of GIS preparedness, as opposed to reactive. With regard to availability of information on impacted facilities, we will no doubt be in better condition than we were for Katrina, but many of the other pieces are still lacking. Specifically, there is still little transparency or availability of realtime or near-realtime data when it comes to assessing response capacity and many other pieces.

Here, we should go back to HSPD-5, which deals with communications and interoperability, and examine how well our GIS assets work together, how well they support standards, and how well-documented they are to allow users to make informed decisions regarding the data.

Closely associated with this is HSPD-8 for preparedness - which comes along with a host of other questions: how current, scalable, flexible and robust is your GIS data, and does it address the need? For example, how many burn units are immediately available in a 100-mile radius right now, how many pieces of fire apparatus are available right now - with the right now being key. In looking at a dozen or so counties nearby, I see almost 500 fire stations, almost 200 law enforcement offices, and nearly 200 emergency medical providers. Yes, they participate in surveys and report in data, but how timely is it? Now consider that Pennsylvania has 67 counties, and the scale of the problem magnifies greatly. Say your EOC is impacted - loss of power, loss of communications, otherwise rendered inoperable. Can you cascade your operations over to a COOP site and continue seamlessly?
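Mechanically, the radius question itself is simple - a minimal sketch follows, using a great-circle distance over a hypothetical facility list. The hard part in practice is not the query but the currency of the data feeding it.

```python
from math import radians, sin, cos, asin, sqrt

# A minimal sketch of the "how many resources within 100 miles right now" question,
# run against a hypothetical in-memory facility list. Real answers depend entirely
# on how current that facility data is.

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in statute miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3958.8 * 2 * asin(sqrt(a))

def resources_within(facilities, lat, lon, miles=100.0, kind="burn_unit"):
    """Count facilities of a given kind within the radius of an incident location."""
    return sum(
        1 for f in facilities
        if f["kind"] == kind and haversine_miles(lat, lon, f["lat"], f["lon"]) <= miles
    )

facilities = [{"kind": "burn_unit", "lat": 40.27, "lon": -76.88}]   # illustrative record
print(resources_within(facilities, 40.0, -76.3))
```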

Here in Pennsylvania, we have many gaps, overlaps and stovepipes for data and communications flow, and many points of failure, from local to local, local to county, county to county, local to state, county to state, between state agencies, local to fed, county to fed, state to fed, and fed to fed. I can't think of any one of these which genuinely works seamlessly with the next. This is what we have been referring to as the gap of pain, and something for which we have a concept and team already up and running to address.

The February 2007 snowstorm, which caused widespread damage and notoriously left hundreds of motorists trapped on Interstate 78 for 20 hours, still remains a major fiasco here in Pennsylvania, with solutions still thoroughly unaddressed politically, fiscally and bureaucratically. On the other hand, technologically, we can address the data, communications and preparedness issues. This is something that we have been looking at quite closely ever since 9/11, investing a lot of time and thought into, and something I will be running through again in light of the SONS exercise. It's time to act and become proactive, and to break through the stovepipes and fiefdoms.

USEPA GIS Workgroup

Posted by Dave Smith On 4/22/2007 04:47:00 PM 0 comments

I am looking forward to attending the EPA GIS Workgroup meeting coming up this spring in Boston, at the Omni Parker House Hotel, May 15th to 18th.

It will be good to refresh some contacts and make some new ones. We are hoping to make some further inroads in the EPA GIS community, combining our present USEPA geospatial expertise and SDVOSB status to reach out and support USEPA regions and other USEPA program offices. For me, it's also another opportunity to go and visit with my relatives in Massachusetts.

MetaCarta Public Sector Users Group

Posted by Dave Smith On 4/22/2007 08:28:00 AM 0 comments

MetaCarta will be having their third annual Public Sector Users Group meeting again in Tysons Corner on May 23rd.

Time and circumstances permitting, I am definitely going to try to attend again this year - Last year at their Users Group, we presented MetaCarta technology integrated with EPA mapping capabilities in EnviroMapper, specifically Window to My Environment.

It's always great to see how folks are geo-enabling and spatially mining their assets using technologies like MetaCarta. Last year, they also demonstrated quite a few other interesting emergent things from their labs - some of their innovations, such as OpenLayers and TileCache have been catching on like wildfire.

For details and registration info: http://www.metacarta.com/PublicSectorUG2007/ - and tell John Henry that I said hello...

ESRI Developer Summit?

Posted by Dave Smith On 3/22/2007 08:56:00 PM 0 comments

No, I was not able to attend... Getting the occasional bits and pieces of information from the DevSummit.

For those folks who are attending, say hi to Xiuzhu Yang, a good friend of mine and very talented developer that I have been working with for the last year on EPA projects.

Another very talented guy that I cross paths with, also at the DevSummit this year, is Vincent Zhuang from SAIC, author of Programming ASP.NET for ArcGIS Server.

EPA Data and TerraIMS

Posted by Dave Smith On 3/12/2007 09:18:00 PM 0 comments

I don't always post about some of the things that we are working on, and perhaps I should do so more often.

I had some mixed feelings about the Google Maps mashup that TerraIMS recently put up, showing EPA Superfund sites in proximity to a given address.
http://www.terraims.com/webservices/superfund.php



EPA has, for many years (in some instances, dating back to 1999), already had a number of community-oriented web mapping applications which deliver a great deal of information on EPA regulated sites, EPA cleanup activities, and so on. For example, one flagship of the USEPA Office of Environmental Information is the venerable Window to My Environment, which, among other things, provides a great deal of information on EPA regulated facilities, watersheds, local and state resources, and the like at a community level. Window to My Environment currently gets well in excess of 50,000 hits per day.

Another excellent application which provides a lot of detailed information on environmental cleanup activity at a local level is the USEPA's Office of Solid Waste and Emergency Response (OSWER) Cleanups In My Community application.



These are just some of the web mapping projects that I have been involved in lately - there are actually several more EPA EnviroMapper applications available at the EnviroMapper StoreFront - and we are currently in the process of overhauling the core infrastructure of these, to migrate from legacy ASP/VBScript platforms to a reusable component-based design and web services, with the ability to host some of these applications in an Oracle Portal environment and to integrate MetaCarta searches, among other things. We have also been doing some preliminary explorations in making EPA data available as KML for Google Earth, and presenting some of our own Google Maps mashups (similar to TerraIMS), all of which are working wonderfully thus far. Unfortunately some of these enhancements are still in prototype, and others are only available on the EPA intranet.
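For a sense of what the KML piece involves, here is a minimal sketch that turns point records into a KML document for Google Earth. The facility records are hypothetical; the production work sits on the real reporting databases and services rather than a hand-rolled file like this.

```python
import xml.etree.ElementTree as ET

# A minimal sketch of publishing point data as KML for Google Earth.
# The facility records here are made up for illustration.

def facilities_to_kml(facilities):
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    for f in facilities:
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = f["name"]
        ET.SubElement(pm, "description").text = f.get("description", "")
        point = ET.SubElement(pm, "Point")
        # KML coordinates are longitude,latitude[,altitude]
        ET.SubElement(point, "coordinates").text = f"{f['lon']},{f['lat']},0"
    return ET.tostring(kml, encoding="unicode")

print(facilities_to_kml([{"name": "Example Facility", "lat": 42.39, "lon": -71.03}]))
```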

So what of TerraIMS? It's great work on their part - but my concern isn't constrained to, nor in any way specific to, this particular mashup. The concern I have is with the description of the effort:

The U.S. Environmental Protection Agency recently released its National Priority list of Superfund Sites in XML format. We converted the XML data and integrated it with a database and then mashed things up a bit. Users enter an address and it is geocoded on the fly, enabling a distance query to be processed against the EPA data in the database. This mashup allows anyone to quickly and easily find the nearest Superfund Sites to their home address or a location of interest.

From this description, it sounds like they essentially performed an extract of the EPA data, massaged and processed it, and then built their mashup. The business case isn't tremendously compelling, as these maps and data have long been available, so it appears to be a mashup just for the sake of doing a mashup.
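The pattern the description implies is a common one: geocode the address, then run a distance query against a local extract of the NPL site list. A rough sketch of that flow follows; the geocoder call and the site records are stand-ins, not real services.

```python
from math import radians, sin, cos, asin, sqrt

# Rough sketch of the geocode-then-distance-query flow; geocode() and the site
# records are assumptions for illustration, not real services or data.

def distance_miles(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3958.8 * 2 * asin(sqrt(a))

def nearest_sites(address, sites, geocode, limit=5):
    """Geocode the address, then return the closest sites from the local extract."""
    lat, lon = geocode(address)
    ranked = sorted(sites, key=lambda s: distance_miles(lat, lon, s["lat"], s["lon"]))
    return ranked[:limit]
```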

Fortunately, Superfund data isn't tremendously volatile; however, it nonetheless raises the question, for other potential applications, of the currency, completeness and accuracy of data utilized in mashups. Herein lies potential liability. If one uses a third-party mashup as a decision support tool, they would need to know to track back to the original source to ensure that the data they are viewing in the mashup is current, complete, and accurate...

FedUC Thursday - Enterprise Service Bus?

Posted by Dave Smith On 1/11/2007 09:18:00 PM 0 comments

Just got home from FedUC and see snow on the ground here-

I actually ran out of room in my "little blue book" with all the thoughts and notes from the conference. Very productive, all in all.

I reckon I can probably share a few more of the thoughts going through my mind...

I followed some of the Enterprise Architecture track today... SAIC gave a presentation on DHS and their notional architecture, which was interesting, and applicable to where we are and where we want to be over at EPA. Their model consists of a foundational layer of geospatial data, harvested via ETL, consumed via web services, et cetera - essentially static data. Next comes an OLAP layer of analytical and modeling tools, and finally realtime, streamed and dynamic data. These layers then plug into an enterprise service bus, for consumption by clients that can make use of the BPEL, flows and integration platform provided by the ESB.
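As a toy illustration of the layering idea only - not of a real ESB, which brings BPEL orchestration, messaging, transformation and much more - here is a sketch in which the three layers register handlers on a simple bus and a client composes a workflow without knowing where each piece lives. All of the service names are invented.

```python
# Toy sketch: static foundation layer, analytical (OLAP-style) layer, and realtime
# feeds registered behind a simple request router. Illustration only; a real ESB
# (WebSphere, BPEL flows, etc.) is a very different beast.

class ServiceBus:
    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        self._services[name] = handler

    def request(self, name, **params):
        return self._services[name](**params)

bus = ServiceBus()
bus.register("foundation.get_layer", lambda layer: f"static features for {layer}")
bus.register("analysis.summarize", lambda region: f"OLAP summary for {region}")
bus.register("realtime.latest", lambda sensor: f"latest reading from {sensor}")

# A client composes a workflow without knowing where each piece actually lives.
print(bus.request("foundation.get_layer", layer="watersheds"))
print(bus.request("analysis.summarize", region="Region 3"))
```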

We currently need an integration framework as well - we have been pursuing a few things in deconstructing and decomposing EnviroMapper into constituent parts, aligned with functional needs, to get them ready for this type of thing - but are ESB and BPEL really ready?

Now, here, Mark Eustis from SAIC is viewing OGC as the world's "virtual service bus". Is this really true? Are OGC services really up to the challenge - and further, ready to be plugged into ESB? Some say no. Time shall tell.

In another Enterprise GIS session, an application was demoed using ArcGIS Server and IBM WebSphere Process Server as the ESB. ESRI does have ESB in mind for AGS; however, here we are still ESRI-proprietary, which doesn't look good for mix-and-match map services in a dynamic application. What about WFS-T and transactional services?

I see I have much to learn about ESB. Seems exciting, but is it really ready for primetime? Our own pursuits of an integration platform are on hold in the meantime... but that doesn't mean we shouldn't still focus on build-out of services and look at the possibilities as things continue to mature...



ESRI FedUC Wednesday...

Posted by Dave Smith On 1/10/2007 11:25:00 PM 0 comments

Just got back from the reception at the Organization of American States - many great conversations, and overall a very productive day for me.

Organization of American States


I sat in on a few of the Homeland Security and Emergency Response sessions - we are having an ever expanding role in that arena. Prominent was IRRIS, from our business partners and fellow Pennsylvania firm, GeoDecisions. We are currently looking at some possibilities for bringing data from mobile sensor platforms directly into IRRIS, as well as dynamic search and discovery possibilities for sensor data - will be interesting to see how it all unfolds.

Other Homeland Security / Emergency Response sessions touched on EMMA, as a good model for statewide emergency management with wide stakeholder support.

There was a good session on NIMS, as it moves forward. The pieces are rapidly falling into place for broader integration and use of incident data, the vision of local-state-federal finally becoming more of a reality. Within NIMS, they mentioned the NOAA plume modeling facility, which is continually run for known sites, utilizing current weather conditions. I wonder how well this is, or can be, tied into available facility data for all of the applicable EPA FRS sites. The NIMS document is currently being revisited for updating, presenting many new opportunities to harmonize incident management efforts across agencies.

It makes a great deal of sense to get ever more stakeholders put together, not just for simple data sharing, but also for lessons learned, sharing of SOPs, models, types of analyses performed, and so on. One report cited as an excellent resource is the 2007 National Academy of Sciences Report, Successful Response Starts with a Map: Improving Geospatial Support for Disaster Management, which cites a number of areas for improvement - for example that data standards do not yet meet emergency response needs, and that training and exercises for responders need geospatial intelligence built into them.

While we didn't have a booth at this event, I saw at least one fellow SDVOSB exhibitor, Penobscot Bay Media - they are doing work in LIDAR scanning mounted on a robotic platform. We talked VETS GWAC a bit... My friends at the EPA booth saw some very brisk business, although they didn't manage to draw Jack Dangermond in. We did catch up with Jack later (will post the picture another time, as it was not my digital camera...)

I got a few minutes to chat with Adena Schutzberg, of Directions Magazine, All Points Blog, and many other good things - she was my Teaching Assistant way back when, at Penn State for Spatial Analysis with Dr. Peter Gould. A few encouraging words from her on my blog and the many diverse things I manage to get my fingers into...

As for the final sessions - I had a few meetings here and there which punctuated the day, and unfortunately ended up missing the GeoLoB discussion, but caught the tail end of the Geospatial One Stop presentation by Rob Dollison of USGS. He discussed a number of things coming down the pike in the next few months, as build 2.1 gets pushed to production in the next month or so, 2.2 in the March-June timeframe, followed by 2.3 - many interesting enhancements, such as search relevance, search booleans, viewing results as a bounding box, a fast base map, 3D viewing, and so on. Exciting stuff. On our end, we are looking for ways to tie our GPT instance at EPA, the GeoData Gateway, into GOS, through the EPA firewall and using integrated security. We are lucky there to have Marten Hogeweg and the same ESRI team working with us that developed GOS.

One thing that had me wondering about the implementation was "authoritative data sources". Here, the intent would be to present "authoritative data sources" near the top of the search results. Will this be another field in the metadata record? Is it determined by virtue of its publisher?
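Speculating on one way it could work: a flag in the metadata record that the search engine treats as a ranking boost, so authoritative sources float toward the top without excluding everything else. A minimal sketch, with made-up records and scores:

```python
# Pure speculation on one possible mechanism: an "authoritative" flag in the metadata
# record used as a ranking boost. Records and relevance scores below are invented.

def rank_results(results, boost=10.0):
    """Sort search hits by relevance, lifting records flagged as authoritative."""
    return sorted(
        results,
        key=lambda r: r["relevance"] + (boost if r.get("authoritative") else 0.0),
        reverse=True,
    )

hits = [
    {"title": "State mirror of NHD", "relevance": 7.2},
    {"title": "USGS National Hydrography Dataset", "relevance": 6.8, "authoritative": True},
]
print([h["title"] for h in rank_results(hits)])
```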

At any rate, it was an enjoyable day, saw many friends, had a lot of exciting and productive discussion, and I am looking forward to more of the same tomorrow. But now, I'm ready for bed...

USEPA Geospatial Information Officer

Posted by Dave Smith On 1/03/2007 01:26:00 PM 1 comments


The USEPA Geospatial Information Officer (GIO) position was posted on USAJobs last week - finally Brenda Smith's shoes will hopefully be filled. Beyond this, from all I have heard, this will indeed be a very high-level position within the agency, which should bode well for the future.

Aside from applications development, much of what I have been doing to date within EPA has been getting a handle on the Geospatial Segment Architecture, identifying alignment with FEA Geospatial Profile and GeoLoB, and other projects, to lay the groundwork for leveraging and harmonizing all of the GIS-related efforts within EPA. This might be a perfect job for me - however, I am thinking I might like to stay in the private sector...

To Savannah, Georgia...

Posted by Dave Smith On 11/20/2006 08:07:00 PM 0 comments

Gratefully, things are finally starting to sort themselves out on the move...

Currently I am gearing up to head south to the Savannah International Trade and Convention Center in Georgia, to attend the EPA Environmental Information Symposium from December 5th to December 7th. This afternoon, I just finished putting together a PowerPoint for a brief talk on ingestion, analysis and mapping of unstructured data during one of the sessions.

We are also hoping to put on a live demo using MetaCarta web services integrated with ArcIMS and the ASP-based Window to My Environment. We are currently in the midst of a redesign of the infrastructure for the Window to My Environment and EnviroMapper applications, to look at good stuff like integrating the applications into the Oracle Portal environment, the ESRI Geospatial Portal Toolkit, Service-Oriented Architecture, and reusable components.

Along the MetaCarta lines, there are a few other interesting tools emerging, with potential for geotagging - Aerotext, which I discussed previously, and SRA's NetOwl, developed for the CIA - which I may get an opportunity to see next week.
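For anyone unfamiliar with geotagging, the basic idea is to pull place references out of unstructured text and resolve them to coordinates. The toy sketch below does this with a tiny hand-built gazetteer; tools like MetaCarta and NetOwl do real entity extraction and disambiguation, so this only shows the shape of the problem. The gazetteer entries are illustrative.

```python
import re

# Toy geotagging: scan text for place names from a small gazetteer and return
# coordinates. Real tools handle entity extraction and disambiguation; the
# gazetteer entries here are illustrative only.

GAZETTEER = {
    "chelsea, ma": (42.3918, -71.0328),
    "savannah": (32.0809, -81.0912),
}

def geotag(text):
    """Return (place, lat, lon) for each gazetteer entry found in the text."""
    found = []
    lowered = text.lower()
    for place, (lat, lon) in GAZETTEER.items():
        if re.search(r"\b" + re.escape(place) + r"\b", lowered):
            found.append((place, lat, lon))
    return found

print(geotag("Sampling results were reported for Chelsea, MA last week."))
```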

I had a great time in Las Vegas last time, so I am looking forward to this event as well...

SAFE joins the GeoBlogging scene

Posted by Dave Smith On 11/15/2006 10:03:00 PM 0 comments

The folks at Safe Software have joined the GeoBlogging scene - They sent out the following:

Announcing Safe Software's Blog
We are pleased to inform you that Safe Software now maintains a weblog (blog) at spatial-etl.blogspot.com.

We plan to use our blog to:

  • keep you informed by posting breaking news about product updates
  • share our insights into developments in our rapidly-evolving industry
  • post information on upcoming Safe Software events
  • direct you to podcasts and articles prepared by Safe staff
  • point to slide presentations from conferences and workshops
  • surprise you every so often with a light-hearted posting, just to keep things lively.

We hope you'll visit our blog, and check back often.

They have been a good group of people to work with - they were very responsive and knowledgeable for us during Katrina for some of our ETL needs, particularly their wizard Juan Chu Chow.

We also hope to be working with them again on one of our EPA projects in the near future - after all, while there is no "true" ETL for spatial data yet, the Safe folks have thus far come much closer than anyone, hands down.
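For readers new to the term, spatial ETL is simply the extract-transform-load pattern applied to geographic data: format translation, schema mapping, reprojection and so on. Here is a minimal hand-rolled sketch over GeoJSON-style dictionaries; FME does all of this (and far more) declaratively, and the field names below are made up.

```python
# Minimal hand-rolled extract-transform-load over GeoJSON-style dictionaries.
# Field names are invented; this only sketches the shape of the pipeline.

def extract(features):
    """Extract: yield raw feature records from a source (here, an in-memory list)."""
    yield from features

def transform(feature):
    """Transform: rename attributes and normalize the values we care about."""
    props = feature["properties"]
    return {
        "type": "Feature",
        "geometry": feature["geometry"],
        "properties": {"site_name": props.get("NAME", "").title(),
                       "status": props.get("STATUS", "unknown").lower()},
    }

def load(features, sink):
    """Load: append transformed features to the target collection."""
    sink.extend(features)

source = [{"type": "Feature",
           "geometry": {"type": "Point", "coordinates": [-75.0, 40.0]},
           "properties": {"NAME": "EXAMPLE SITE", "STATUS": "ACTIVE"}}]
target = []
load((transform(f) for f in extract(source)), target)
print(target)
```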

ESRI UC

Posted by Dave Smith On 8/11/2006 06:31:00 PM 0 comments

Still unwinding from the ESRI User Conference - I had meant to post during the conference but there just wasn't enough time, from the daily activities and parties after.

Now, I still have a few hours to kill in San Diego before my flight, and am finally taking it easy. The Conference was a lot of fun - Though being in that mode of hundreds of intense 5-minute conversations over the course of the day can tend to put one in an odd state of mind - by the end of that day, you can scarcely remember who you talked to, or about what... And too much going on, at a frenetic pace. I had to step out of quite a few sessions to support a few ongoing things with our federal customers, so it was definitely a working vacation...

At any rate, there were a lot of old friends, colleagues and adversaries from USDA, EPA and other agencies, as well as CSC, Perot Systems, Lockheed, Jim Knudson, my MetaCarta buddies, Chris Cappelli from ESRI and others from PA and elsewhere, and as always a lot of excellent intel and takeaways. I did get a brief chance to talk to the keynote speaker, Senator Bob Kerrey, now of the New School. There were also many excellent and interesting new contacts made, such as Kim Ollivier, who is the father of the New Zealand national Transverse Mercator grid, and some who I've known for years but never actually met, like Mike Binge - he and I have been on the same side, preaching the Surveying and GIS message for years, and often diametric opposites, busting each others' chops on political issues...

The biggest thing for me - some of my main customers were there... And though there's plenty of excellent showcasing and eyepopping demos, sometimes one has to put the brakes on and use a cautious and critical eye. We are currently in the midst of going from just ArcIMS in our shop to Geospatial Portal Toolkit and ArcGIS Server for web development (yay, great new toys!), but there are still some questions and concerns that linger, with regard to placing too much dependency on some of the more granular out-of-the-box ESRI pieces that these ship with - for map viewers, for instance - with developer ADFs not yet completely standardized and harmonized, not all versions of map viewers standardized, and the very distinct possibility in some instances of an ESRI update rollout breaking our apps where we have plugged them into our customizations.

The good news is that the new architecture I have been developing decouples components anyway, so I will be more able to plug in some robust ESRI pieces rather than reinvent the wheel; this strategy also reduces potential breakage and/or makes the breaks much simpler to fix. The second half-full glass is that a more standardized and harmonized ADF is coming, and not necessarily terribly far off. I think the ESRI team has come quite a ways.
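The decoupling amounts to the classic adapter idea: the application codes against a small viewer interface of its own, and a thin wrapper isolates the vendor component, so an ADF change only touches the wrapper. A minimal sketch follows; every class and method name here is invented for illustration.

```python
# Sketch of decoupling via an adapter: the application depends only on MapViewer,
# and vendor-specific calls are isolated in one wrapper class. All names invented.

class MapViewer:
    """The only surface the rest of the application is allowed to depend on."""
    def add_layer(self, layer_name): ...
    def zoom_to(self, xmin, ymin, xmax, ymax): ...

class VendorViewerAdapter(MapViewer):
    def __init__(self, vendor_control):
        self._control = vendor_control              # the out-of-the-box vendor widget

    def add_layer(self, layer_name):
        self._control.addMapResource(layer_name)    # vendor-specific call isolated here

    def zoom_to(self, xmin, ymin, xmax, ymax):
        self._control.setExtent(xmin, ymin, xmax, ymax)
```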

So... on to ArcGIS Server...



ESRI User Conference...

Posted by Dave Smith On 8/05/2006 03:31:00 PM 0 comments

Tomorrow morning I'm boarding my flight to head out to San Diego for the 2006 ESRI User Conference - I am looking forward to getting away for a few days, and am looking forward to the conference. Quite a few friends and colleagues will be there once again... It will probably mostly be EPA-centric for me (you may find me helping out at their booth at several points), as a majority of my work is still focused on their projects. Some 85 people representing EPA - from program offices, to regions, to some of their contractors like CSC - will be attending.

It's been a few years since I last went... I usually end up swamped with too many things on my plate, or with other competing events going on. So this time I end up hitting both FedUC and the regular UC in the same year. I am also expecting to see several of my other friends from MetaCarta and other companies and organizations there - and am also hoping to make some new friends and contacts.



Measuring Up: The Business Case for GIS

Posted by Dave Smith On 6/01/2006 09:48:00 PM 0 comments

I finally got a chance to finish reading the last group of case studies in Measuring Up: The Business Case for GIS. I had started reading the book a few months ago, while embroiled in some of my current GIS architecture efforts - and while I enjoyed the case studies, I was at the time looking for some good meat for building up business case documentation, to also include regulatory compliance, discussion and analysis of requirements and use cases, cost-benefit analysis, and, in general, a lot more technical material and discussion of business case approach, development and documentation.

Unfortunately this book does not provide much depth in those areas. What the book does provide, however, is a robust and broad collection of case studies, much like the ESRI Map Books or several of Winnie Tang's books.

Back in February or so, the book came recommended to me by EPA GIO Brenda Smith - and indeed it is a great idea book, to illustrate by example how business cases can be met by GIS.

The book presents about 75 case studies, arranged topically as follows:

  • Save money/cost avoidance
  • Save time
  • Increase efficiency
  • Increase accuracy
  • Increase productivity
  • Increase communication and collaboration
  • Generate revenue
  • Support decision making
  • Aid budgeting
  • Automate work flow
  • Build an information base
  • Manage resources
  • Improve access to Government
  • Enterprise GIS

The authors, Christopher Thomas and Milton Ospina, additionally ensured a good mix of industries, such as Business, Government, Natural Resources, Transportation and Utilities.

In general, I have typically seen the two key drivers to be savings / cost avoidance and legislative or other regulatory mandate. Beneath these typically fall some of the others, such as automation of workflow, increased efficiency and productivity, and increased accuracy, as means to the higher ends. In planning Enterprise GIS implementations, however, ALL should be considered, and then plugged in wherever they may support the direct drivers mentioned above.

Since quickly perusing the book and reading many of the case studies back in February, I am glad I was able to slow down enough to pick it up again and look through it in detail. However, at some point it would be great to see some good material available on how to thoroughly develop the business case, in terms of cost-benefit analysis, use case analysis, requirements analysis, and the like.

For anyone else who may find it of use, I pass along the details:

Measuring Up: The Business Case for GIS
Christopher Thomas and Milton Ospina
Paperback: 200 pages
Publisher: Esri Press (September 28, 2004)
Language: English
ISBN: 1589480880


