Monday, April 15, 2013

ESRI JSAPI 3.4 and the Dojo Build System

[Update: The Saga Continues]

In a previous post, I outlined how I use the Dojo Build System to optimize my web app code for production. Specifically, I showed how I get around the problem of working with ESRI's ArcGIS API for JavaScript library, which has already been run through the build system. However, with their recent upgrade to AMD-style module loading, my handy trick of using:

dojo['require']('esri.map');

... to fool the build system into skipping 'esri...' module imports didn't work anymore. I had no idea how to exclude modules when loading them like this:

define(['esri/map'], function (Map) { /* ... */ });

So I headed over to #dojo to see if the experts had any ideas. Fortunately for me, brianarn was there and was aware of the problem. After some brainstorming, we came up with the idea to use a custom loader plugin for loading ESRI modules. Since the build system doesn't try to flatten modules that are imported with nested requires, we hoped that importing them through the plugin would solve my problem. The plugin was a relatively simple implementation:
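The idea, roughly (this is a sketch, not the exact code):

// app/EsriLoader.js - an AMD loader plugin that defers esri module loading
define([], function () {
    return {
        // the loader hands load() everything after the '!' in the module id
        // (e.g. 'esri/map'), plus a context-sensitive require and a callback
        load: function (moduleName, require, onLoad) {
            // a nested require keeps the build system from trying to resolve
            // and flatten the esri module into one of your layer files
            require([moduleName], function (esriModule) {
                onLoad(esriModule);
            });
        }
    };
});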

You can put this module within any package and use it like this:

define(['app/EsriLoader!esri/map'], function(Map) { /* ... */ });

Using the plugin to load ESRI modules effectively prevents the build system from trying to include them in your layer files, thus allowing the build script to complete successfully. Of course, none of this hacking would be needed if ESRI would just release their source code. :)

If you find a better way of getting around this problem or have any other suggestions please let me know in the comments section below.

Thursday, June 28, 2012

How to Get Started Using Aptana Studio 3 for ArcPy Development (Screencast)

Just put together a short (10 min) video on how to set up Aptana Studio 3 and its PyDev plugin for developing Python scripts that use ESRI's ArcPy module. Let me know what you think!


Cross-posted on http://gis.utah.gov/developer/.

Thursday, June 21, 2012

From Tinkerer to Developer; Or How I Got My Dream Job

Today I got an interesting email from a GIS Analyst in Baltimore:
Mr. Davis, 
I am a GIS Specialist who is wrapping up an MS in GIS and looking forward to applying my newly acquired skills to projects here at Baltimore City’s DOT. I recently stumbled onto your blog (http://geospatialscott.blogspot.com/) through an esri forum posting (http://forums.arcgis.com/threads/32892-Google-Streetview-and-Javascript-api) and was quite impressed with the breadth of your programming skills and abilities. 
I was asked if I could try to create a program that would display accident history reports based on intersections in a GIS. My GIS application development experience is limited to what I learned through my coursework and I do not have a programming background. My question for you is how did you go about developing your programming skills as a GIS professional? Do you know of any resources that might aid me in going from a GIS analyst to a creative GIS developer like yourself? I see a heavy need for this type of GIS Developer work here at Baltimore DOT and I would like to contribute as much as possible, utilizing all the skills and resources available to me. 
I’m sure you are super busy, but any suggestions or input would be much appreciated from this aspiring GIS(lowercase)p. 
Thank you in advance, [name withheld]
This caused me to reflect upon the past few years and how I've arrived at such an awesome job. I work for the Utah AGRC as a Geospatial Developer and I love my job. There are very few days that I'm not excited to get to work and create something cool. Every day I'm challenged by new problems to solve through code.

I've worked in my current position for almost two years and am just now becoming comfortable calling myself a developer. This is because I'm entirely self-taught and am naturally self-conscious about my knowledge and skill as a developer. My formal education was in Geography with an emphasis in GIS; no computer science classes at all. I started my career as a GIS Analyst working for several local municipalities.

Here are a few things that I think have contributed to my transformation from a GIS Analyst that had no programming experience to a Geospatial Developer.

30 Minutes Per Day

At the beginning of every GIS job that I have had, I have always asked my supervisor if he would allow me to spend 30 minutes per day learning something new. Universally, the response has been, "Really!? You want to better yourself!? Heck yes you can." OK, that may not be the exact wording, but you get the idea. My managers have always been happy to give me that time. I believe that it's because they see it as an investment in their employee.

You Gotta Love it Baby

In order to have the motivation to teach yourself something as complicated and frustrating as programming you have to enjoy it. Find something that excites you and then learn about it. If you are not excited to learn about it, you better move on to something else.

Learn By Doing Real Work

It's important to me to learn by doing something related to a real project rather than a demo or example project. This helps me stay invested in it and lets me know if the technology will really work for my environment. For example, recently I've been reading about backbone.js. Instead of trying to work through an example project that has no relation to anything that I would ever build, I've been reading through the examples and trying to translate them into one of my current projects. By doing this I'm finding that backbone may not be the best solution for my projects. I'm not sure that I would have come to this conclusion as quickly if I had buried myself in demos. Demos are super useful for showing me how something works, but if I'm writing something, I want it to be connected to my world.

Find a Yoda

Having someone smarter than you to ask questions of is a necessity. I really feel like this is what has made the difference for me. It was only when I was able to get past my own self-consciousness and ask smart people questions that I really felt like I made progress.

Fortunately, in my experience, most programmers are more than happy to give you a little advice and point you in the right direction. I've had several 'famous' JavaScript people respond to my questions on twitter within minutes.

The best ones won't give you the answer right away, but will give you just enough info for you to find the answer on your own. @SteveAGRC is a master at this and has been a great mentor for me. As he would say, "you don't learn anything by keeping your mouth shut."


So I hope that this is the beginning of an answer for my new friend in Baltimore. Maybe a later post will be about the specific languages and technologies that I think are the best to learn (JavaScript & Python). If you have any other suggestions for this aspiring developer, please leave a comment.

Tuesday, April 24, 2012

Phonegap + Leaflet + TileMill = Offline Mobile Maps


Recently I've been researching a mobile project that will require offline-capable base maps. After searching unsuccessfully for an existing solution, I decided to try to wire together Phonegap, Leaflet, and TileMill's .mbtiles files. After a few late nights I was able to see it come together and sent out a quick tweet. This has generated quite a few requests for more information, so I threw up a repo on GitHub to demo what I was able to put together.

The main idea is to download the .mbtiles file to the device using Phonegap's File API. Since .mbtiles files are just SQLite databases, I was able to use a SQLitePlugin to open them up (thanks, coomsie!). I did run into a problem getting this plugin to read BLOB fields from the .mbtiles database. However, after a bit of poking I was able to get it to work.
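Roughly, that first step looks something like this (a sketch only; the URL, file name, and error handling here are placeholders, not the code from the repo):

// grab the .mbtiles file: use the local copy if it exists, otherwise download
// it with Phonegap's FileTransfer
var TILES_URL = 'http://example.com/tiles.mbtiles'; // placeholder

function getMBTiles(onReady) {
    window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, function (fs) {
        fs.root.getFile('tiles.mbtiles', {create: false}, function (entry) {
            // already downloaded on a previous run
            onReady(entry.fullPath);
        }, function () {
            // first run: download into the app's persistent storage
            var target = fs.root.fullPath + '/tiles.mbtiles';
            new FileTransfer().download(TILES_URL, target, function (entry) {
                onReady(entry.fullPath);
            }, function (error) {
                console.log('download failed: ' + error.code);
            });
        });
    });
}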

Once I had access to the encoded image data, it was only a matter of writing a custom Leaflet tile layer (TileLayer.MBTiles.js), inspired by a similar one that coomsie had done. One of the big secrets was passing {scheme: 'tms'} into the constructor.
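The real TileLayer.MBTiles.js is in the repo; the core idea is roughly this (a sketch that assumes a WebSQL-style database handle from the SQLitePlugin and base64-encoded tile_data, so the details will differ from the actual plugin and from newer Leaflet internals):

// sketch: a Leaflet tile layer that reads tiles out of the mbtiles database
L.TileLayer.MBTiles = L.TileLayer.extend({
    initialize: function (db, options) {
        this._db = db;
        // mbtiles stores tiles in the TMS scheme, so tell Leaflet to flip y
        L.TileLayer.prototype.initialize.call(this, '',
            L.Util.extend({scheme: 'tms'}, options));
    },
    _loadTile: function (tile, tilePoint, zoom) {
        var sql = 'SELECT tile_data FROM tiles ' +
                  'WHERE zoom_level = ? AND tile_column = ? AND tile_row = ?';
        this._db.transaction(function (tx) {
            tx.executeSql(sql, [zoom, tilePoint.x, tilePoint.y],
                function (tx, result) {
                    // hand the encoded image straight to the tile's <img>
                    tile.src = 'data:image/png;base64,' +
                        result.rows.item(0).tile_data;
                });
        });
    }
});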

Initially the performance was quite good on the iPad 2 and the iPhone 4 that I tried it out on. However, after cleaning up the project in preparation for uploading it to GitHub, the performance has suffered a bit. Not sure what I did, but I'll post an update when I get it figured out.

This is my first experience with Phonegap and Objective-C so any suggestions for improvements would be greatly appreciated.

[Added on 4-26-12]
P.S. The app downloads the .mbtiles file automatically from my Dropbox account the first time that it runs and stores it locally in the Documents folder. Each subsequent time it runs, it uses this local file. So after the initial download you should be able to open up the app and see tiles while in airplane mode.

Monday, June 6, 2011

Using The Dojo Build System To Speed Up Your ESRI JavaScript API Apps

[Updated 4-15-2013]
I've come up with a solution for ESRI JSAPI 3.4 and AMD.


As your JavaScript projects get more and more complex, loading all of those Dojo classes can really slow down your load time. All those dojo.require calls add up in a hurry. The Dojo Build System can be a huge help in speeding up the load time and general performance of your apps. For example, a build that I ran on a recent project took the number of JavaScript requests on page load from 53 down to 5. The CSS requests went from 16 down to 4. This ended up cutting the load time in half! Other nice features include stripping out all of the console calls, minifying your JavaScript, and interning all of your widget templates.

The rest of this post assumes some familiarity with the Dojo Build System. If you haven't looked at it before, the documentation is worth reading. There's even a fancy new tutorial.

After reading all of the Dojo documentation, it's easy to get excited about the possibilities. However, you will quickly find that mixing the ESRI API into the equation makes a big mess of everything. For example, the Dojo build system assumes that you are hosting everything yourself, but because ESRI has not released a source/unbuilt version of their API that we can download, we are stuck loading Dojo from their servers. The other problem is that when you load the ESRI API you are really loading their layer file, which can have a lot of overlap with your own layer file, adding a lot of duplicate code. Not to mention the problems that the build system has when it sees dojo.require("esri...") and doesn't know where to get it. Over the last few months I've developed a solution to overcome these problems and end up with a lean and mean (for the most part) product in the end.
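The heart of the workaround is loading ESRI modules in a way the build system's dependency scanner can't see. The scanner only picks up literal dojo.require("...") calls, so bracket notation keeps the esri modules out of your layer file while still loading them at runtime (the dijit module below is just a stand-in for code you host yourself):

// gets picked up by the build system and baked into your layer file
dojo.require('dijit.layout.BorderContainer');

// the dependency scanner doesn't recognize this form, so the esri module is
// left out of the layer and simply loaded from ESRI's servers at runtime
dojo['require']('esri.map');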

Wednesday, May 18, 2011

Python Script To Update Current Stream Gauge Data In New AGRC Flood Map

We use a Python script to scrape data from the USGS and NWS web sites to update our data in the SGID. It runs every two hours through Windows Scheduled Tasks. The script's workflow is as follows:

First it loops through all of the features in our stream gauges feature class (SGID93.WATER.StreamGaugesNHD).

For each feature, it uses a USGS id (SourceFeature_ID) to build a URL to hit their Instantaneous Values web service:

# get json object
data = json.loads(urllib2.urlopen(r'http://waterservices.usgs.gov/nwis/iv?format=json&site=' + id).read())


This is an example of one of the URLs: http://waterservices.usgs.gov/nwis/iv?format=json&site=09413700. The script then uses the json library to parse the data and get the values that we are interested in. These values are used to populate the appropriate fields in our feature class.

def getJsonValue(variableCode, data):
    for ts in data['value']['timeSeries']:
        if ts['variable']['valueType'] == variableCode:
            value = ts['values'][0]['value'][0]['value']
            return value

The NOAA data is served up via an RSS feed, which means XML. The minidom module from the xml.dom library came in handy here for parsing the XML data.

# get noaa data
gaugeID = row.getValue('GuageID')
if gaugeID:
    ndata = minidom.parse(urllib2.urlopen('http://water.weather.gov/ahps2/rss/fcst/' + gaugeID.lower() + '.rss'))
    descriptionText = ndata.getElementsByTagName('description')[2].firstChild.nodeValue
    descriptionList = descriptionText.split('<br />')
    row.setValue('HIGHEST_FORECAST', descriptionList[5].split()[2].strip())
    row.setValue('HIGHEST_FORECAST_DATE', getNOAADate(descriptionList[6].split('Time:')[1].strip()))
    row.setValue('LAST_FORECAST', descriptionList[8].split()[2].strip())
    row.setValue('LAST_FORECAST_DATE', getNOAADate(descriptionList[9].split('Time:')[1].strip()))
So in the end we have one feature class that combines real-time data from multiple sources. You can check out a copy of the script here.

Wednesday, March 16, 2011

ArcPy.Mapping Module Makes Complex PDF Creation Easy

Recently, I was presented with a problem that was a perfect opportunity for trying out ESRI's ArcPy Python site package. We have a series of map documents that are set up with Data Driven Pages to export various maps for each of the counties of Utah. Each mxd had a different theme. The goal was to develop a script that would export all of the Data Driven Pages for each mxd and then combine them by county. After a few hours of work I had 72 lines of code that did just that. Here's what I came up with:

I used only two modules for this script: arcpy.mapping and os (great for working with the file system).
# import modules
import arcpy.mapping, os

# variables
baseFolder = os.getcwd() # current working directory
outputFolder = baseFolder + r'\PDFs'


The os module was great for deleting the old files and getting a list of the map documents.
# clear out old pdfs
print '\nDeleting old PDFs...'
oldPDFs = os.listdir(outputFolder)
for f in oldPDFs:
    os.remove(outputFolder + '\\' + f)

# get list of all files in the folder
print '\nGetting list of mxds...'
allItems = os.listdir(baseFolder)

# filter out just .mxd's
mxdFileNames = [(x) for x in allItems if x.endswith('.mxd')]
mxdFileNames.sort()


The DataDrivenPages class was the key class in the ArcPy.Mapping module for this script. It is obtained through the MapDocument class. Here I start to loop through the mxd's and get a reference to the DataDrivenPages object that I am interested in.
# loop through mxds
for name in mxdFileNames:
    print '\nProcessing: ' + name
    # get mxd
    mxd = arcpy.mapping.MapDocument(baseFolder + '\\' + name)
    # get datadrivenpages object
    ddp = mxd.dataDrivenPages


Once I've got the DataDrivenPages object, then I start to loop through all of the pages.
# loop through pages
pg = 1
while pg <= ddp.pageCount:
    # change current page
    ddp.currentPageID = pg
    # get name of current page
    name = ddp.pageRow.getValue('NAME')
    print name


Before I export the page, I check to see if there is already an existing PDF for that particular county. If there is, I export the page to a temp PDF file and then use the PDFDocument.appendPages() method to add it to the existing PDF. If not, I just export it out to a new PDF.
# check to see if there is already a pdf file created for this county
pdfFile = outputFolder + '\\' + name + '.pdf'
if os.path.exists(pdfFile):
    print 'Existing pdf found. Appending...'
    # open PDF document
    pdf = arcpy.mapping.PDFDocumentOpen(pdfFile)
    # output to temporary file
    tempFile = outputFolder + '\\temp.pdf'
    ddp.exportToPDF(tempFile, 'CURRENT')
    # append to existing file and save the changes
    pdf.appendPages(tempFile)
    pdf.saveAndClose()
    # delete temp file
    os.remove(tempFile)
    # clean up variables
    del pdf
else:
    # file does not exist, export to new file
    print 'No existing pdf found. Exporting to new pdf.'
    ddp.exportToPDF(pdfFile, 'CURRENT')

# increment page number
pg = pg + 1


Then, all that's left is a little cleanup.
# clean up variables
del mxd, ddp

raw_input('Done. Press any key to exit...')


And that's it! Here's the entire script and an example output pdf.