Surveying, Mapping and GIS

Exploring all aspects of mapping and geography, from field data collection, to mapping and analysis, to integration, applications development and enterprise architecture...

Showing posts with label OGC.

Building a Headless Linux GeoServer Box

Posted by Dave Smith On 8/30/2009 08:40:00 AM 17 comments

I recently inherited some older machines and, to support some ongoing in-house experimentation I've been involved in, set them up as quick-and-dirty servers for geospatial data services. The approach I took was to build what are essentially minimal machines running Linux in command-line mode, then load GeoServer on them to serve the data. As I haven't blogged in a while, a friend suggested that a quick description of the mechanics might be a good thing to share for folks who haven't dipped their toes into Linux much.

As a disclaimer, I do not claim any guruhood when it comes to Linux or the other packages, and this is not warranted to be a "hardened-and-tweaked" system for production - it's just some very quick and dirty steps toward standing up a headless Linux-based GeoServer instance. Note that this uses the default Jetty install; some folks prefer to run it under Tomcat, which is a different path.

So, I started out with the "minimal install CD" for Ubuntu 9.04, available here:

https://help.ubuntu.com/community/Installation/MinimalCD



Select a package appropriate for the CPU you are using - in my case, I chose Ubuntu 9.04 for 32-bit PC.

Burn the ISO and follow the prompts in the text-based installer to install a command-line interface (CLI) system. I essentially went with the defaults. You will want to have the machine connected to the internet so that it can identify and set up the network connection and grab any files needed during the install.

Once you've installed a minimal version of Linux, you will be ready to configure and install the other goodies.


For remote administration, you may want to install OpenSSH - http://www.openssh.com/



The step for doing this is simple:

Log in to your Linux machine, and use the following command:

sudo apt-get install openssh-server

This will download and install the OpenSSH package. For folks new to Linux, sudo tells it to run the command with superuser privileges, and it will prompt for the password of the administrative account you created when you installed Linux (Ubuntu does not set a separate root password by default). apt-get install uses the Advanced Package Tool to search for, retrieve and install software packages for Linux - this makes installation of most standard software in Linux easy.
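If this is a fresh install, it's also worth refreshing the package lists first, and apt can search by keyword if you aren't sure of a package name. Neither command below is specific to this setup - just general housekeeping:

sudo apt-get update
apt-cache search openssh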

For remote administration, you'll want to know how to reach your machine on the network - you can get the IP address by using the ifconfig command, which will give results something like this:
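Output varies by machine, but it looks something like this (the addresses here are made up) - the inet addr value on your primary interface, eth0 here, is the one you want:

eth0      Link encap:Ethernet  HWaddr 00:1a:2b:3c:4d:5e
          inet addr:192.168.2.125  Bcast:192.168.2.255  Mask:255.255.255.0
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1

lo        Link encap:Local Loopback
          inet addr:127.0.0.1  Mask:255.0.0.0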



If you use Windows as a primary OS for your other work, you can then access the box from a Windows machine using an SSH client. I usually use PuTTY: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html



From there, you can install PuTTY on your Windows machine and then access the Linux box remotely via the command line for administration.

Plug in the IP address you got above:



and voila - you should be presented with a login screen for your Linux box:



Tools like PuTTY are a great asset when it comes to administering boxes.
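As an aside, if the machine you're connecting from happens to be another Linux box or a Mac, you don't even need PuTTY - the stock ssh client does the same job (substitute your own username and the IP address you found above):

ssh yourusername@192.168.2.125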

Side trip into remote administration aside, on to the REAL stuff: Installing GeoServer.

As a prerequisite, you will need to install the Java JDK - the GeoServer install page gives some recommendations, and here's how you would do it from the command line:

sudo apt-get install sun-java6-jdk

Next, you will need to do some configuration of the JDK.

Define the default Java to use:
sudo update-java-alternatives -s java-6-sun

And set the JAVA_HOME directory - this is doable in a number of ways; one straightforward option is to define it in /etc/environment. I really like 'nano' as an editor for command-line Linux environments, and it comes pre-installed in the minimal Ubuntu 9.04 version.

sudo nano /etc/environment

Again 'sudo' makes sure you have an appropriate privilege level to write the changes.

In nano, you can navigate around in the file using your arrow keys. Insert the following:

JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.14/


nano is intuitive and easy to use, following the commands along the bottom of the screen, e.g. ctrl-O to write changes, ctrl-X to exit.
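One gotcha: /etc/environment is only read at login, so the new variable won't show up in your current shell until you log out and back in. If you want it right away, or just want to sanity-check things, you can set and verify it by hand (using the same path as above):

export JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.14/
echo $JAVA_HOME
java -version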

Now on to the fun stuff - installing GeoServer.

GeoServer isn't available via apt - so you will need to download and unzip it to install it.

To be able to work with ZIP archives, you'll need the unzip package:

sudo apt-get install unzip

Next, you can download GeoServer.

Decide where you want to put it - some folks put it in /usr/local or /usr/share, or, if you are just experimenting, you could even leave it in your home directory. If putting it in /usr/share, you would cd /usr/share

To download it, the download location given on the GeoServer page is http://downloads.sourceforge.net/geoserver/geoserver-1.7.6-bin.zip

Thus, to download it, use wget -

sudo wget http://downloads.sourceforge.net/geoserver/geoserver-1.7.6-bin.zip

Then, unzip it

sudo unzip geoserver-1.7.6-bin.zip
and you should see the files extracting into a geoserver-1.7.6 folder.

Depending on where you put it and privileges held by the account you are using, you may also need to ensure you have ability to access and run GeoServer and that GeoServer can create any files it needs.

chown will change ownership, using -R makes it recursive through subfolders and files:

sudo chown -R geoserver_username geoserver-1.7.6 would change all files and directories to be owned by the user specified (geoserver_username as a placeholder).

You can list files using ls and navigate directories using cd.

You may then also want to configure some environment variables, such as the location of your GeoServer installation directory, e.g. GEOSERVER_HOME="/usr/share/geoserver-1.7.6" - again, you could do this using nano to edit /etc/environment, and there are plenty of other ways to do it. You could also define other parts of GeoServer, such as GEOSERVER_DATA_DIR, at this point as well - consult the GeoServer docs for details there... http://docs.geoserver.org/1.7.x/en/user/
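As an illustration only - assuming the /usr/share location used above and a separate data directory, both of which are just placeholder paths - the relevant lines in /etc/environment might look like:

GEOSERVER_HOME="/usr/share/geoserver-1.7.6"
GEOSERVER_DATA_DIR="/usr/share/geoserver_data"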

Pretty much ready to run now... cd to the bin directory under your GeoServer install, e.g. cd /usr/share/geoserver-1.7.6/bin, and launch the startup script with sh startup.sh and voila... You will see some program output scroll by, ultimately ending with an output line like

[main] INFO org.mortbay.log - Started SelectChannelConnector@0.0.0.0:8080

which tells you that the GeoServer Jetty container is up and listening for connections on port 8080.
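One caveat for a headless box: startup.sh runs in the foreground, so GeoServer will stop when you close the SSH session it's running in. A quick-and-dirty workaround (there are cleaner ways, such as wiring up an init script) is to background it with nohup - a sketch, assuming the same install path:

cd /usr/share/geoserver-1.7.6/bin
nohup sh startup.sh > ~/geoserver.log 2>&1 &
# later, to stop it, use the bundled shutdown script:
sh shutdown.sh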

Now, open a browser and point it to your machine's IP address on port 8080, at the geoserver path, e.g. http://192.168.2.125:8080/geoserver/ - after an initial "loading" screen you should get the GeoServer web interface:



And you are off to the races... Confirm that it works via the demos:

OpenLayers NYC Tiger map

Again, this is just meant to be a quick-and-dirty guide - enough to make even someone with minimal Linux experience armed and dangerous. From here, there are many tweaks and customizations that can be made, such as optimizing performance, hardening, security and so on (there are plenty of discussions around the web and on listservs regarding this) - but I figured I'd at least share this as a quick start for anyone looking to play with GeoServer in a minimal Linux environment...

Fun With Virtual Earth

Posted by Dave Smith On 7/15/2007 07:39:00 PM 2 comments

It's been a very busy few weeks, so I haven't had much chance to post... things going on with Virtual Earth, ArcIMS Route Server and GDT/TeleAtlas, survey-grade GPS data collection, emergency response, logistics, modeling and simulation, and plenty of other things flying around.

It's all fun stuff, but I always enjoy rolling the sleeves up and getting dirty... doing the AJAX thing, mashing various web services with Virtual Earth's V5 API via pure JavaScript clients as various proof-of-concept applications.

Here's one quick app: Virtual Earth and the NASA MODIS WMS server:

Here I'm showing VE and MODIS side by side; both views refresh dynamically. The next step will be to explore the "Roll Your Own Tile Server" approach to seamlessly getting custom WMS content directly into VE.

The next one was even more fun: Virtual Earth and the MetaCarta JSON API. All pure Web 2.0, neogeo, slippy AJAXian goodness - "Look, Ma! No SUBMIT button!" Type in your query (searching for documents about toxic substances here...) and it fetches results from the MetaCarta appliance and sprays them back into the VE map view. Pan, zoom, and instantly you get new stuff popping up. All self-contained in a few K of DHTML, CSS and JavaScript - no Java, .NET, Ruby, Python or other infrastructure needed. Both were written in a total of about 2.5 hours, just noodling around in the APIs without any real thought ahead of time.

Vector Data, SOA and Scalability

Posted by Dave Smith On 6/26/2007 04:35:00 PM 3 comments

One of the things I am still trying to get my head around is scalability in vector-based web services, such as OGC Web Feature Services or ArcIMS Feature Services. Certainly with image services, one can do a lot of magic behind the scenes - such as tiling, caching, load balancing.

In many instances, an image service will suffice well, but for power users, for ad-hoc queries and analysis, the full geometry and attribute data is often needed. And in the case of a distributed enterprise, here is one place where a purely SOA-oriented approach begins to break down.

Things become a bit more difficult when it comes to vector geometry, as most GIS clients are still only geared toward consuming and processing vector data in one chunk.

Further, vector geometries can't well be broken into tiles without causing other breakage - polygons and linear features need to retain their topological integrity in order to work.

Yes, one can certainly cache vector feature services, provided the underlying data is relatively static, or apply constraints limiting the amount of data that one can fetch at a time, but is there any possibility, looking down the road, of utilizing more efficient serial or multiple parallel processes to rapidly and efficiently stream large and complex vector datasets?

I think there will need to be, and I don't yet see OGC, ESRI or anyone else looking at this. I'd be interested in hearing other folks' thoughts and experiences on this...
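For reference, the "constraint" workaround mentioned a couple of paragraphs up usually just amounts to capping requests - e.g. a WFS GetFeature call with maxFeatures (hypothetical server and type name):

curl "http://example.com/geoserver/wfs?service=WFS&version=1.1.0&request=GetFeature&typeName=topp:roads&maxFeatures=1000"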

W3C, REST vs. SOAP, and where are we headed?

Posted by Dave Smith On 2/22/2007 04:28:00 PM 2 comments

Looks like some major battles are heating up... There is an upcoming W3C workshop: Workshop on Web of Services for Enterprise - and it is stirring up quite a few emotions and arguments and throwing them out onto the table for debate.

Is it to be a battle or reconciliation between REST and SOAP/WS-*?

Here's one take on it, from Jérôme Louvel: Will we reconcile REST, WS-* and SOA?

Or is the bigger question, what W3C's role should even be? Nick Gall of Gartner put a few pointed, incendiary statements into his position statement, which are drawing quite a bit of attention:


It is my position that the W3C should extricate itself from further direct work on SOAP, WSDL, or any other WS-* specifications and redirect its resources into evangelizing and standardizing identifiers, formats, and protocols that exemplify Web architectural principles. This includes educating enterprise application architects how to design "applications" that are "native" web applications.

It appears that there are also a few drawing a line between "World Wide" and "Enterprise"... or suggesting that W3C should abandon aspects of these pursuits and leave them up to industry.

Certainly the geospatial community will need to sit up and take note of where this leads... Having written my own WFS and WMS services and clients from scratch, though I am still no Web Services guru, I had to wonder about the wisdom of the approaches being used in the whole paradigm. Are the current OGC standards really in line technologically with the vision that was originally expressed by Tim Berners-Lee? I think not. Are they compatible with some of the security and other needs of the community? I think not. Not the end of the world, but certainly much iterative refinement will be needed to get us where we ultimately need to be.

Stefan Tilkov at InfoQ provides a good roundup of many of the position papers. Will we see some simplification and resolution to the ever-emergent convoluted forest of Web Services and Standards?



A late picture from ESRI FedUC 2007

Posted by Dave Smith On 2/02/2007 11:14:00 AM 0 comments

I was delinquent in posting this pic from the 2007 ESRI FedUC...


A group of colleagues from the USEPA contracting world - From left to right: Myself, Jessica Zichichi (Innovate!), Jack Dangermond, Claudia Benesch (CSC - Agency Central Support) and Catherine Harness (CSC - GeoData Gateway Lead)

This was the reception at the Organization of American States - and yes, that is a map on my necktie...




More from FedUC - What did Jack Dangermond Let Slip?

Posted by Dave Smith On 1/12/2007 02:43:00 PM 2 comments

A provocative title for the post...

The FedUC conference wrapped up with an excellent lunch (aside from the staple in my veggies)... and a great presentation from CW4 Michael Harper of the USACE Topographic Engineering Center on Buckeye, DAGR and other interesting things going on there... may merit a separate post.

The good Doctor provided, as he so often does, a great Q&A session as a wrap-up. Some of the bullets from that discussion:


  • East coast tech support is coming - ESRI is standing up a tech-support team in North Carolina, to offer expanded AM hours, beginning at 6AM. Additionally, they may investigate the possibility of better search capabilities in the online tech support material, via Google.
  • ESRI continues to work with the University of Redlands, with the 1-year Masters program - potentially to be expanded to similar programs at George Mason for those of us working in the DC area. The DC area is a hotbed of geospatial activity, with good talent scarce.
  • Mr. Dangermond described some interesting things he has going on internally, to provide real-time financial reporting on the state of ESRI - which, no surprise - are geo-enabled. Being able to get this kind of reporting is an area of interest which I have no doubt will grow in other firms with a broad geographic reach.
  • In conjunction with FedUC was the first Classified ESRI GIS community meeting, attended by 300 or so - featuring discussion of applications in the TS/SCI NOFORN arenas. Typically the pitfall is stovepipes, but being able to get cleared personnel together to discuss topics of common interest will be a boon to the DoD/Intel community.
  • ESRI is trying to bolster its Java support, with their Java team growing and building better support for Java classes.
  • The Geography Network is to be overhauled, and replaced with a more robust, more collaborative version.
  • 9.3... Saving goodies for last. The 9.3 beta is scheduled for sometime this summer, probably around the Conference. Some improvements in the hopper for 9.3 include interoperability and OGC, improvements to ArcGIS Server, Mobile Applications development support, and addressing stability and known issues with 9.2. They are also looking at PostgreSQL.
  • A next iteration is probably a year to year and a half away yet. Here, the focus will be simplification of the user interface, support for multiple views and multiple documents, and capability of storing all types of geofiles in a geodatabase - such as metadata catalogs, and so on.



FedUC Thursday - Enterprise Service Bus?

Posted by Dave Smith On 1/11/2007 09:18:00 PM 0 comments

Just got home from FedUC and see snow on the ground here-

I actually ran out of room in my "little blue book" with all the thoughts and notes from the conference. Very productive, all in all.

I reckon I can probably share a few more of the thoughts going through my mind...

I followed some of the Enterprise Architecture track today... SAIC gave a presentation on DHS and their notional architecture, which was interesting, and applicable to where we are and where we want to be over at EPA. Their model consists of a foundational layer of geospatial data, harvested via ETL, consumed via web services, et cetera - essentially static data. Next, an OLAP layer, of analytical and modeling tools, and finally realtime, streamed and dynamic data. These are to then plug into an enterprise service bus, for consumption by clients which can make use of the BPEL, flows and integration platform provided by ESB.

We currently need an integration framework as well - we have been pursuing a few things in deconstructing and decomposing EnviroMapper into constituent parts, aligned with functional needs, to get them ready for this type of thing, but is ESB and BPEL really ready?

Now, here, Mark Eustis from SAIC is viewing OGC as the world's "virtual service bus". Is this really true? Are OGC services really up to the challenge - and further, ready to be plugged into ESB? Some say no. Time shall tell.

In another Enterprise GIS session, an application was demoed, using ArcGIS Server and IBM WebSphere Process Server as the ESB. ESRI does have ESB in mind for AGS, however here we are still ESRI-proprietary, which doesn't look good for mix-and-match map services in a dynamic application. What about WFS-T and transactional services?

I see I have much to learn about ESB. Seems exciting, but is it really ready for primetime? Our own pursuits of an integration platform are on hold in the meantime... but that doesn't mean we shouldn't still focus on build-out of services and look at the possibilities as things continue to mature...



AJAX, ASPMap and WMS

Posted by Dave Smith On 11/05/2006 11:38:00 PM 2 comments

Though I have been playing with code for close to 30 of my 40 years, I have always considered myself far more of a hack than any kind of disciplined programmer with a formal background. My focus has always been that of a science and engineering professional, oriented toward solving specific domain issues - surveying, hydrology, environmental issues, transportation, and the like - and most recently I have been thrust into Enterprise Architecture, mainly due to my understanding of both the business and IT sides of geobusiness. Yet in terms of coding I have always managed to get some quick and often unique results in my forays into the programming world, and I am once again quite pleased tonight. One of the things I was ruminating on a few days ago is the possibility of using ASPMap as a web service, calling the service via JavaScript and populating the view via AJAX. Better yet, extending ASPMap to serve WMS requests - but I am getting ahead of myself...

Yesterday I sat down and managed to build a web service around ASPMap (MS Visual Studio 2005, ASP.NET and .NET Framework 2.0). As its primary functionality, it currently accepts bounding box and image size as parameters - layers and other settings will be forthcoming. Essentially the service renders the image and then dumps it out as a file - I used a milliseconds function in JavaScript to provide the unique identifier (no cookies, no viewstate or other server-side state issues - all of this is managed by JavaScript in the browser).

I then tried Matteo Cassati's SOAP Client for JavaScript and used the DOM to insert the image (with the known URL) into the view. I still want to better understand what I can and can't do with the return types, but that will come...

I then built some JavaScript functions to pan, zoom, et cetera, as well as one to resize the image (with a new call) whenever the browser is resized (note the views here are full-screen, although with the banner blurred out to protect a potential customer). The map view will be dynamic, trying to maximize screen real estate. I also want to do some nice AJAX magic with collapsing/resizable map controls and query panels (the navigation/toolbar shown in the upper right is already enabled in this fashion)...

The next thought actually ties back to my original thought in a way... Having already built all the infrastructure, what would be involved in hitting an OGC WMS service and bringing in the image? Not much at all, just a little magic trying to navigate how JavaScript deals with ampersands in constructing strings. An hour and a half tonight, and my WMS client is also done - you see here a classic image from NASA JPL (MODIS Mosaic) - there is still much more tweaking and functionality to be developed, but for a mere weekend's work for a hack like me, I am quite pleased with the results.
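For anyone curious, the request the JavaScript ends up assembling is just a standard WMS GetMap URL along these lines (hypothetical host and layer name - and those ampersands are exactly where the string-building fiddliness comes in):

curl -o map.jpg "http://wms.example.gov/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap&LAYERS=modis&STYLES=&SRS=EPSG:4326&BBOX=-180,-90,180,90&WIDTH=800&HEIGHT=400&FORMAT=image/jpeg"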

Considering I managed to fit all of this in during a hectic weekend, in the midst of preparing for a move (with painting and carpeting the new house, boxing things up at the old house, et cetera) and driving 4 hours to Washington, DC tonight for meetings tomorrow, I am frankly amazed that I managed to get this done so quickly.

I am very much looking forward to what the next few months will bring. My intent is to build some very robust and reusable infrastructure. What I have started here will become universal throughout future applications, and the beauty of it is that I will be able to snap in whatever mapping infrastructure is available, whether ArcIMS or ArcGIS Server, ASPMap, WMS Servers, or anything else I can either build a Web Service wrapper around or convert to WMS outright.

Geo Web Services and Reliability

Posted by Dave Smith On 10/01/2006 11:40:00 AM 0 comments

In a recent blog post, Matthew Perry (PerryGeo) lashed out, reaffirming his reluctance to rely on spatial web services such as WMS. Given that the NASA JPL WMS server was evidently down, many users and applications which consume the JPL WMS services also had issues and outages.

Yes, certainly any distributed system can face issues with downtime, or worse yet, with services going away altogether. How to remedy this?

As a service provider, one of the things a provider can work toward is giving users information with which they can make informed decisions. Essentially, manage their expectations. In some arenas, particularly where commercial services are offered, this should be in the form of a formalized Service Level Agreement (SLA), as part of the contract.

Here, the provider can outline the parameters of the content itself (such as FGDC-compliant metadata, including accuracy, source, et cetera), and beyond each iterative data snapshot, provide information on data latency (how current is the data?) and how often it will be refreshed. From an infrastructure standpoint, some hosting parameters should also be set forth, such as availability and reliability - to set the ground rules on what kind of downtime is acceptable or unacceptable, what kind of data throughput can be expected, and so on.

This becomes a bit more of a challenge in other arenas, such as agencies where external budgetary, legislative and other pressures may impact how well an organization can support published services.

But nonetheless, if an agency is forthcoming and up-front with these constraints at the outset, it can allow its users and stakeholders to plan and prepare accordingly - by writing their code to handle outages appropriately, by utilizing alternative services, and so on. It can also lead toward consolidation, augmentation and replication of valued services as external users and stakeholders lobby for support (via legislative initiative and budget) for those services. An agency might not be able to provide formalized parameters as a commercial service would be expected to, however providing users with some basic insight will go a long way.

As organizations gradually move more and more toward Service Oriented Architecture, and as variegated Web Services appear right and left and increasingly become available lights-out, via a simple URL, it becomes increasingly easy to connect to and consume these services - however, that ease of access and use is ultimately bound to cause some disappointment when things change down the road. I recommend SLAs and service metadata as tools for managing expectations.




ESRI UC

Posted by Dave Smith On 8/11/2006 06:31:00 PM 0 comments

Still unwinding from the ESRI User Conference - I had meant to post during the conference but there just wasn't enough time, from the daily activities and parties after.

Now, I still have a few hours to kill in San Diego before my flight, and am finally taking it easy. The Conference was a lot of fun - Though being in that mode of hundreds of intense 5-minute conversations over the course of the day can tend to put one in an odd state of mind - by the end of that day, you can scarcely remember who you talked to, or about what... And too much going on, at a frenetic pace. I had to step out of quite a few sessions to support a few ongoing things with our federal customers, so it was definitely a working vacation...

At any rate, a lot of old friends, colleagues and adversaries at USDA, EPA and other agencies, as well as CSC, Perot Systems, Lockheed, Jim Knudson, my MetaCarta buddies, Chris Cappelli from ESRI and others from PA and elsewhere, plus many excellent and interesting new contacts made, and as always a lot of excellent intel and takeaways. I did get a brief chance to talk to keynote speaker Senator Bob Kerrey, now of the New School, and to meet folks such as Kim Ollivier, who is the father of the New Zealand national Transverse Mercator grid, and some whom I've known for years but never actually met, like Mike Binge - he and I have been on the same side, preaching the Surveying and GIS message for years, and often diametric opposites, busting each others' chops on political issues...

The biggest thing for me - some of my main customers were there... And though there's plenty of excellent showcasing and eyepopping demos, sometimes one has to put the brakes on and use a cautious and critical eye. We are currently in the midst of going from just ArcIMS in our shop to the Geospatial Portal Toolkit and ArcGIS Server for web development (yay, great new toys!), but some questions and concerns linger with regard to placing too much dependency on some of the more granular out-of-the-box ESRI pieces these ship with - map viewers, for instance - with developer ADFs not yet completely standardized and harmonized, not all versions of map viewers standardized, and the very distinct possibility in some instances of an ESRI update rollout breaking our apps where we have plugged them into our customizations.

The good news is that the new architecture I have been developing decouples components anyway, so I will be able to plug in some robust ESRI pieces rather than reinvent the wheel, and this strategy also reduces potential breakage and/or makes the breaks much simpler to fix. The second half-full glass is that a more standardized and harmonized ADF is coming, and not necessarily terribly far off. I think the ESRI team has come quite a ways.

So... on to ArcGIS Server...



ESRI User Conference...

Posted by Dave Smith On 8/05/2006 03:31:00 PM 0 comments

Tomorrow morning I'm boarding my flight to head out to San Diego for the 2006 ESRI User Conference - I am looking forward to getting away for a few days, and am looking forward to the conference. Quite a few friends and colleagues will be there once again... It will probably mostly be EPA-centric for me (you may find me helping out at their booth at several points), as a majority of my work is still focused on their projects. 85 people representing EPA, from program offices, to regions, to some of their contractors like CSC will be attending.

It's been a few years since I last went... I usually end up swamped with too many things on my plate, or with other competing events going on. So this time I end up hitting both FedUC and the regular UC in the same year. I am also expecting to see several of my other friends from MetaCarta and other companies and organizations there - and am also hoping to make some new friends and contacts.



DHS Mandates Information Sharing

Posted by Dave Smith On 7/23/2006 11:23:00 PM 1 comments

I have been scrutinizing the current 2007 Department of Homeland Security Authorization Bill (HR 5814)... Of particular interest are several sections under Title V, which mandates information sharing - specifically mentioned are State, Local, Tribal and Regional partners, along with a data fusion concept.

(a) Establishment- The Secretary shall establish a State, Local, and Tribal Information Fusion Center Initiative to establish partnerships with State, local, tribal, and regional information fusion centers.
(b) Duties- Through the State, Local, Tribal, and Regional Information Fusion Center Initiative, the Secretary shall--
(1) coordinate with the principal official of each State, local, tribal, or regional information fusion center and the official designated as the Homeland Security Advisor of the State;
(2) provide Department operational and intelligence advice and assistance to State, local, tribal, and regional information fusion centers;
(3) support efforts to include State, local, tribal, and regional information fusion centers into efforts to establish an information sharing environment (as defined under section 1016(2) of the Intelligence Reform and Terrorism Prevention Act of 2004 (Public Law 108-458; 118 Stat. 3665));
(4) conduct table-top and live training exercises to regularly assess the capability of individual and regional networks of State, local, tribal, and regional information fusion centers to integrate the efforts of such networks with the efforts of the
Department;
(5) coordinate with other relevant Federal entities engaged in homeland security-related activities;
(6) provide analytic and reporting advice and assistance to State, local, tribal, and regional information fusion centers;
(7) review homeland security information gathered by State, local, tribal, and regional information fusion centers and incorporate relevant information with homeland security information of the Department;
(8) Provide management assistance to State, local, tribal, and regional information fusion centers;
(9) Serve as a point of contact to ensure the dissemination of relevant homeland security information.
(10) facilitate close communication and coordination between State, local, tribal, and regional information fusion centers and the Department;
(11) provide State, local, tribal, and regional information fusion centers with expertise on Department resources and operations;
(12) provide training to State, local, tribal, and regional information fusion centers and encourage such information fusion centers to participate in terrorist threat-related exercises conducted by the Department; and
(13) carry out such other duties as the Secretary determines are appropriate.

Music to my ears, anyways... I have been preaching sharing and fusion as the only workable longterm solution for over a decade. However, how well it will actually work out is yet to be seen.



Geospatial Line of Business RFI Submitted

Posted by Dave Smith On 5/05/2006 10:17:00 PM 0 comments

It's been a very busy week.... many back to back meetings, and a lot of fast and furious typing, multitasking, teleconferencing and a few brainstorming sessions which have left me drained.

Among several other things that went out the door, we did submit a private-sector response to the Geospatial Line of Business RFI today. My public-sector counterparts have been quite harried as well. They have also been ordered by OMB to collect a variety of data on their geospatial business. Agency-wide data calls are being made to collect up data by the 15th.

I have also heard that some key agencies were alienated and somewhat uncooperative due to the OMB "bull in a china shop" approach. However, the public face and cooler heads have, for the most part, prevailed.




There are some indications that the Geospatial Line of Business may yet be a rocky place where consolidation will find limited purchase. One puzzlement/worriment is that even as the RFI responses have been submitted and OMB is looking to agencies to compile information on their geospatial assets and business, still no answers to the questions submitted at the Industry Days session or in the subsequent Q/A period have been posted on the OMB website. There are also some other questions about the legality of the approach...



Geospatial Line of Business Debrief

Posted by Dave Smith On 4/19/2006 10:02:00 PM 2 comments

A few people have been asking me for a debrief on the Geospatial Line of Business meeting yesterday at the JW Marriott in DC. Mixed feelings on the event. With regard to Agency people, there were a few of the regular movers and shakers, but a considerable number of Agency geospatial figures were completely absent - and of the ones I spoke to, I got a mixed response. Some were unconcerned and coolly interested, some very nervous, others very skeptical. There was an odd industry turnout - several of the key players that I often work with in the rarified air were there, but others were conspicuously absent. Their response was similarly mixed. Sure, lots of "what's in it for us?" But also many doubts and trepidations.

My own key concern is that Geospatial business is horizontal, as opposed to vertical. I also had some not insubstantial doubt that the task at hand and its full ramifications were fully grasped by those responsible for this endeavor.

With regard to the meeting itself, I unfortunately missed part of the first section - I got into a minor collision on I-270 on the way down. Two cars behind me, a driver failed to notice the slowdown in morning rush-hour traffic and rammed an Acura RSX into the back end of my workhorse Subaru. Fortunately nobody was hurt, I was able to get cool heads to prevail over the definite rage potential, and everyone got the information they needed to deal with it another way, on another day.

Fortunately I didn't miss much of the meeting. The initial presentation was mainly a rehash of the PowerPoint that I presented in my last post. This was followed by a Q&A session. There were many questions, but only sketchy answers. There were several questions on what the resultant direction would be - a sampling:

  • Whether a Center of Excellence approach would be taken - the answer to this one was that they had looked at this, and it held some good low-hanging fruit, but that it was not entirely satisfactory and wouldn't strictly be emulated
  • What kind of approach or form of acquisition would be taken, such as a specific contract vehicle - no decision yet
  • Minority business participation, et cetera - will be a factor
  • Sponsorship - sustainability would be key

Ultimately they were trying to stress goals of business process re-engineering, standardization, sharing, and culture change. Lofty goals. But little in terms of approach - which is the purpose of the RFI. Interoperability was indicated to be a key driver.

There were also some questions submitted, but unanswered -

  • What exactly do you mean by "geospatial" - traditional GIS or anything with a geo-component, such as address databases, lat/long values in a database, or locational references...?

They did recognize the need for many areas of Agency geobusiness to continue in place even with consolidation. They also indicated that the results will show up in the President's Budget submission for 2008 - keep your eyes peeled next February.

They do promise to post the questions and answers online. The formal Q/A period goes until 4/21/06. Following this, read the RFI closely, read closely the Vision, Goals, and Objectives section, and provide a best response. There will be limited ability to assess a huge volume of responses, so if possible try to consolidate responses with others - for example by feeding into agency responses, industry/nonprofit organization responses, or joint responses by the major private sector players. I had already intended to respond and have already been contacted by a couple of key people... I may have parts of my response feeding into others' responses.

They are providing a template and some guidance - they are allowing an additional 20 pages to be submitted, to bring the total to 30 pages, not including cover and TOC. Submittal of current best practices is encouraged, with Agency approval.

The current point for dissemination of information is here: http://www.whitehouse.gov/omb/egov/c-6-8-glob.html

Whether or not this comes to fruition, this is definitely still a very big deal... And it IS most definitely about consolidation and budget cuts. "No further expenditures on existing business will be made."



Geospatial Line of Business

Posted by Dave Smith On 4/17/2006 09:58:00 AM 0 comments


Tomorrow I'm going to try and attend the Geospatial Line of Business session in DC. This is rapidly approaching and rapidly becoming a huge issue with many of the folks I work with in the Agencies. Truly, this could be one of the biggest shakeups in Geobusiness currently going on in the Federal government. Essentially, the intent and implication is consolidation of Geospatial activities across agencies. Agencies which are inefficient or ineffective at supporting an identified Line of Business may find themselves no longer engaged in that Line of Business, with that work subsumed by a more capable Agency. What is a Line of Business? In most instances, it is a vertical sector of the Agency's business. However, in some instances, it is a horizontal sector, such as Geospatial business.

The intent is in improving efficiency and effectiveness, and in raising the bar to provide geospatial centers of excellence in government. The potential upside of this is improved interoperability and a breaking down of historical stovepipes. The potential downside is that it may rip the guts out of some agencies and introduce a new layer of bureaucratic abstraction.

Care must be taken...

The FBO notification can be found here: http://www2.fbo.gov/spg/GSA/V/VC/GSV06PD00089/listing.html

Some additional supporting documents and items of interest can be found here:

I'm hoping to get together with other movers and shakers there... I am definitely interested in tracking this, working on it and providing input wherever possible, particularly as I have already been engaged in Agency-wide Geospatial segment and Enterprise Architecture work since last June. I have a good idea of what opportunities and pitfalls may be encountered.



Still looking for another GIS developer

Posted by Dave Smith On 3/13/2006 10:10:00 PM 0 comments

Our EPA GIS Center of Excellence team did manage to get a couple more people aboard - they are starting in two weeks - but we still have room for one more GIS developer.

The main thrust of the work involves GIS web applications (such as ArcIMS-based apps), along with GIS web services (OGC WMS and WFS) and GIS database development (Oracle 9i/10g / Oracle Spatial and SDE 9.1 environment)...

A full description is available here.



GeoWeb 2006... signs of things to come?

Posted by Dave Smith On 3/02/2006 10:28:00 PM 0 comments


GISUser posts an update on the GeoWeb 2006 conference coming up in July in Vancouver, BC. This appears to be the burgeoning spearpoint of SOA in the geospatial world, and an event that I would just love to sit in on, given some of the work I have recently been engaged in.

The interesting thing is that the article notes Microsoft will be lead sponsor for the event. Microsoft appears to be positioning itself ever more for providing tools and the IDE for OGC-based technology.

I continue to watch the evolution of such things as CarbonTools, which is .NET-based. As a user and fan of .NET myself, and having developed a WFS service using .NET, it makes great sense to me. We will see what the competition has to say... Oracle will be there, but no sign of ESRI. Not a good sign.

Unfortunately I think I will probably not be able to attend...

DIPEx Image Processing and Exploitation

Posted by Dave Smith On 2/11/2006 03:13:00 PM 0 comments

Yesterday I got to see a live demo of a remarkable product, DIPEx, which was developed by DataStar, which is one of our teammates on the USEPA ITS-ESE contract. DIPEx leverages NASA's ELAS software for image processing by adding web-enabled Java GUIs and also by making the entire application accessible and extensible via OGC Web Services, complete with WSDLs for easy integration. The whole system is built on Open Source technology, running on Apache, with PostGIS on the backend.

In the demo, they showed a number of features and functions, such as a full-featured web-based frontend to rival any ArcIMS-based application I have seen, with transactional WFS capabilities to edit vector data. They then used the newly created vector to in turn initiate a workflow that modified an aerial image in raster form and brought it back into the web view, with multiple OGC-based vector and raster map services in various different map projections and other things going on.

Particularly given that their strength is in remote sensing imagery processing, these guys have some great things going on, and somehow have managed to fly under the radar with it. I'm hoping we can bring in their technology and capabilities to leverage some of what we are doing in our enterprise efforts...



Architecture, MetaCarta and other stuff....

Posted by Dave Smith On 2/11/2006 08:57:00 AM 0 comments

Haven't posted in a few days, things have been very hectic. We serendipitously have synergy of many things coming together at once - an excellent opportunity to rearchitect a number of legacy GIS databases and applications for 2006, at the same time that new Federal Enterprise Architecture guidance and other technology items are arriving on the scene, such as enhanced support for OGC standards in the pipeline from Dr. Sharma and the Oracle Spatial team and from ESRI. I've been spending a lot of time over the last two weeks trying to go from high-level enterprise standpoint to solution architecture across over a dozen systems, identifying the gaps, overlaps, and opportunities for standardization, FEA realignment and reusability. This also ties in well with work I'm doing for the USEPA Geospatial Metadata/Data/Services Architecture project I've been involved in since the fall.

Apart from that, plenty of exposure on a few other things- the 2006 MetaCarta Public Sector User's Group meeting at Tyson's Corner on the 16th is going to highlight one of our recent successes in integrating MetaCarta technology into the USEPA "Window To My Environment" application. I'm hoping to attend for at least part of it.

For anyone interested in attending the 2006 MetaCarta Public Sector UG, I think they may still have some (very limited) space remaining... Details are available on the MetaCarta website, or you can contact my good friend John-Henry Gross, 703-629-0972. I note that one of the highlights is that John-Henry will be discussing the new GTS Analyst feature, as well as some enhancements in the GIS Connector and other additions and improvements.

This WME/MetaCarta application is also going to be featured at the upcoming USEPA Contractor Forum as well...





Carbon Project WMS, WFS and GML support for ArcGIS

Posted by Dave Smith On 1/28/2006 08:58:00 PM 3 comments

The Carbon Project, which is an effort to provide open source GIS tools for Open Geospatial Consortium interoperability using .NET, has made a new tool available for ArcGIS Desktop- CarbonArc. CarbonArc is an extension for ArcGIS that allows an ArcGIS client to access OGC-compliant Web Map Services (WMS), Web Feature Services (WFS) and Geographic Markup Language (GML).

While ESRI's Data Interoperability Extension also provides this functionality, along with offering a whole host of other interoperability options, the ability to have a simple extension for OGC-only will be good for many organizations, because the ESRI Data Interoperability Extension is not included by default in an Enterprise License Agreement (ELA). No fault of ESRI, as for DIE they are using core functionality provided by Safe Software (producers of Feature Manipulation Engine).

Back in September, in the aftermath of Hurricanes Katrina and Rita, my company was involved in developing a Web Feature Service for a federal agency to allow drilldown into a massive facilities database from an ArcGIS client. As the backend had some limitations, we ended up developing a wrapper in .NET which accepted WFS requests: GetCapabilities responded with a series of 'themes' that we had classified the facilities into, DescribeFeatureType responded with the appropriate schema, and finally a nontransactional GetFeature actually queried the legacy database, transformed the response, and rendered a GML response with the facilities' geometry and attributes. Additional query parameters could be passed to filter facilities based on various criteria.
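For those who haven't poked at WFS directly, the request flow described above boils down to three HTTP calls along these lines (the endpoint and type name are made up for illustration):

# What does the service offer? (returns the list of 'themes' as feature types)
curl "http://example.gov/facilities/wfs?service=WFS&version=1.0.0&request=GetCapabilities"
# What does a given feature type look like? (returns an XML schema)
curl "http://example.gov/facilities/wfs?service=WFS&version=1.0.0&request=DescribeFeatureType&typeName=FacilityTheme"
# Fetch the actual features as GML, optionally with additional filter parameters
curl "http://example.gov/facilities/wfs?service=WFS&version=1.0.0&request=GetFeature&typeName=FacilityTheme&maxFeatures=100"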

We initially did our testing using Carbon Project tools, such as CarbonViewer and Gaia, for lack of better WFS clients that were readily available. Then we were able to also test using ArcGIS and Data Interoperability Extension. Ultimately it worked quite well (barring the limitations of the legacy backend database), but DIE was a slight limitation for the agency, due to the licensing issues. Ultimately I felt we were a little ahead of our time, but it's great to see the rest of the world catching up.


