Monday, May 25, 2015

Staying in the Zone with AMD Butler

A few months ago, I built AMD Butler, a simple Sublime Text 3 plugin for managing AMD dependencies. Now it's hard for me to picture coding without it. If/when I make the switch to Atom, this will be the first thing I port over from Sublime.

AMD Butler is all about staying in the zone. First, let's take a look at life without it:
  1. Get a great idea
  2. Start coding
  3. Decide to add an AMD dependency
  4. Stop coding
  5. Scroll to the top of your file
  6. Remember and type the exact module id
  7. Scroll down to the factory function parameters
  8. Remember the order of the dependencies
  9. Think of what to name the return parameter
  10. Scroll back to where you were working
  11. Completely forget what you were doing
Now let's look at life with AMD Butler:
  1. Get a great idea
  2. Start coding
  3. Decide to add an AMD dependency
  4. Execute the AMD Butler add command
  5. Type the first few letters of the module id and hit enter
  6. Continue coding in the zone
This is what it looks like: 

AMD Butler dynamically crawls your existing modules and builds a quick list. It only takes a few keystrokes to find the correct one and then it automatically adds it to your list of dependencies with an appropriate associated factory function argument. All without affecting the position of your cursor. This is especially nice to use after slurping ESRI JS modules. No more scrolling, no more trying to remember module names or preferred argument aliases. Just quickly add a dependency and get back to what you were doing.
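For illustration, here's a sketch of the edit the add command performs, using a stubbed loader so it runs standalone (the module ids are just examples):

```javascript
// Stub define() so this sketch runs outside a real AMD loader.
const calls = [];
function define(deps, factory) { calls.push({ deps, factory }); }

// Before: two dependencies with matching factory arguments.
define(['dojo/_base/declare', 'dojo/dom'], function (declare, dom) {});

// After the add command: the module id is appended to the dependency
// array and a matching argument is added to the factory, keeping the
// two lists in the same order.
define(['dojo/_base/declare', 'dojo/dom', 'dojo/topic'],
    function (declare, dom, topic) {});

console.log(calls[1].deps.length === calls[1].factory.length); // true
```

The key point is that the dependency array and the factory's parameter list stay in lockstep, which is exactly the bookkeeping the plugin automates.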

There are also commands for sorting, removing, and pruning unused dependencies.

AMD Butler can be installed via Sublime Package Control. Head over to its GitHub page to check out the code or report any issues.

Monday, April 6, 2015

Windows Scheduler: Get Your Priorities Straight

At AGRC we have a variety of tasks (usually Python scripts) that need to run on a schedule. These are usually workflows that scrape and ETL data for web applications. Currently we use Windows Scheduler to run these scripts. Recently I've had problems with scripts taking way too long to complete. After a bit of digging I discovered that, by default, Windows Scheduler assigns a process priority of "Below Normal" to all tasks, and it provides no UI to change this setting. The following steps work around the problem by hand-editing the XML export of a task.
  1. Right-click on the task and export it as an XML file. 
  2. Open the file in a text editor and search for the "Priority" element. 
  3. Change the value of this element to the desired priority level. See this page for a list of possible values. Usually 6 is what you want. 
  4. Save your changes and close the XML file. 
  5. Delete the original task and re-import the modified XML file as a new task. 
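For example, after step 3 the edited fragment of the exported task might look like this (surrounding elements omitted):

```xml
<!-- Fragment of the exported task XML; Priority lives under Settings. -->
<Settings>
  <!-- 7 ("Below Normal") is the scheduler default; 6 runs the task at normal priority. -->
  <Priority>6</Priority>
</Settings>
```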
More details

Sunday, April 5, 2015

I'm Changing My Twitter Handle

For me Twitter has been almost exclusively a tool that I use in my profession to connect with others doing the same work as me and to keep up with the ever-changing landscape around geospatial software development and web development in general. I've never tweeted about what I was having for breakfast or what my five crazy kids are up to.

Twitter has been an invaluable resource for me and I work hard at doing my little part to keep the service useful and relevant. I quickly unfollow people who let their streams wander into meaningless or offensive paths that at best waste my time and at worst degrade my mind. My time is incredibly valuable to me and I try to treat others with the same attitude.

Occasionally, I feel compelled to share something that's not related to work but is nevertheless valuable, uplifting, or inspiring. It might be related to my favorite hobby, parenting, or my faith. I've largely kept these to myself, in part because my Twitter handle included the name of my employer and in part because I didn't know if it would be appropriate.

However, over the past few years, I've felt that I should do more to counter the indecency of the internet conversation. In the midst of so much darkness I feel like the light that I have to share can make a real difference, that I can do a bit more to keep my tiny part of the internet out of the shadows.

For these reasons I've decided to change my Twitter handle from ScottAGRC to SThomasDavis to allow me to feel more free in what I share. I don't foresee a huge change in my stream, but don't be surprised to see a bit more faith-based content. In sharing I don't mean to offend, and I'd be happy to answer any honest questions that may arise.

Happy Easter!

Tuesday, January 20, 2015

How To Use AGRC Base Maps in QGIS

Most people know about AGRC's awesome base maps. They are very popular and provide high quality cartography using the latest and greatest data from the Utah SGID. But did you know that they provide a WMTS service that can be consumed in non-ESRI products?
Here's how to load our base maps in QGIS 2.6.1:
  1. The first step is to find the URL to the service that you are interested in. Most of AGRC's base maps are within a folder called "BaseMaps" on our main ArcGIS Server instance. Once you find the specific layer that you are interested in, copy the URL for the WMTS link at the top of the services directory page:
  2. Open QGIS and click on the "Add WMS/WMTS Layer" button to open the "Add Layer(s) from a WM(T)S Server" dialog.
  3. Click on the "New" button to open the "Create a new WCS connection" dialog, enter a name for the connection and the URL of the WMTS service, and click "OK" to close the dialog.
  4. You should now see a new layer in the add layer dialog. Select the new layer and click on the "Add" button to add it to the map.
  5. You should now be able to view the base map as a layer in QGIS!

Bonus Tip

If you are having performance issues using our cached services through ArcMap, try loading them via these WMTS services. You can do this by double-clicking on the "Add WMTS Server" node in the ArcCatalog tree under "GIS Servers" and then pasting the same URL as above.

Tuesday, November 18, 2014

How to Wire up Travis-CI to your JS Projects

For the past six months, AGRC has been using Travis CI to automatically test and lint our projects each time we push a commit to the associated GitHub repository. Even though we run these tasks locally it's been helpful to have them run on Travis for when we miss things. It's also a major step towards automated deployments as well as running our tests via something like Sauce Labs or Browser Stack.

The setup is relatively simple. The first step is to sign in to Travis-CI with your GitHub account. Once you're signed in you can go to your accounts page and see all of the GitHub repositories associated with your account. Switching a repository to "ON" tells Travis-CI to start watching any new commits that you push to that repository.


The next step is to let Travis-CI know what you want it to do. The first part of this step is accomplished by creating a .travis.yml file at the root of your project. Here's an example from one of our projects:
language: node_js
node_js:
  - '0.10'
before_install:
  - npm install -g grunt-cli
  - npm install -g bower
  - npm install
  - bower install
notifications:
  email:
    on_success: never
The lines below before_install load all of the project dependencies via npm & Bower. The notifications code just tells Travis to only send us emails when a build fails.


The second part of defining what you want Travis-CI to do is to add a scripts property to the package.json file for your project. Travis-CI automatically runs npm test for NodeJS projects; adding this property defines that command. We use a special travis GruntJS task to run tasks, so this is the command for us:
"scripts": {
    "test": "grunt travis -v"
}
The travis grunt task can contain any sub-tasks that you want. Here's what ours looks like:
grunt.registerTask('travis', [
    // ... project-specific sub-tasks ...
]);

Build Status Badge

The icing on the cake is to copy code from Travis-CI into your app's README to show a "build:passing" or "build:failing" (gasp!) badge. You can do this by going to your project's page on Travis-CI and clicking on the badge in the upper right-hand corner of the page.

Integration

After getting everything wired up you'll notice that pull requests automatically display the build status of each commit and will let you know if it is still waiting on a build to run.
still waiting
good to go
If you want to see all of this in action you can check out the AGRCJavaScriptProjectBoilerPlate repository.

Monday, September 22, 2014

grunt-esri-slurp - Make Your Own ESRI JS Package

I recently contributed to a blog post about a great tool for scraping ESRI's AMD build of their JS API. If you are interested in doing your own builds with the Dojo Build System and ESRI's JS API, you should definitely check it out.

Friday, March 28, 2014

Demystifying the Dojo Build System - 2014 Dev Summit Presentation

The ESRI Dev Summit this year was awesome as usual. This was my third year and it keeps getting better and better for me. I love being able to have direct access to ESRI developers and rubbing shoulders with amazing developers working on the same problems that I am. And the plenary this year was really great.

I was privileged to present again this year. My submission title was Demystifying the Dojo Build System. Here's the abstract:
If you are not using some sort of build system for your JavaScript apps, then you are missing out on some huge performance gains. Concatenation, minification, and interning strings will almost certainly shave seconds off of your page load times. The Dojo Build System is a program that can apply these types of "deployment optimizations" to your source code. However, it can have a steep learning curve, and throwing the ArcGIS API for JavaScript into the mix only complicates the situation. This presentation will untangle the build system and give you a solid overview of all of the moving parts. We will explore real world examples of how the Utah AGRC uses this system in our web applications and how it can be applied to your applications as well.
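As context for what the build system consumes, a minimal build profile applying these optimizations might look like this (the package and layer names are hypothetical, not from the presentation):

```javascript
// build.profile.js sketch; package and layer names are hypothetical.
var profile = {
    basePath: './src',
    releaseDir: '../dist',
    action: 'release',
    layerOptimize: 'closure',   // minify each built layer
    stripConsole: 'normal',     // drop console.* calls from release code
    packages: [
        { name: 'app', location: 'app' },
        { name: 'dojo', location: 'dojo' }
    ],
    layers: {
        // Concatenate the app and its dependencies into a single file.
        'app/run': { include: ['app/main'] }
    }
};
```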
It obviously struck a chord because the room was packed. My presentation ended up turning out well and I got some great feedback. I think that there are a lot of people interested in building their applications but the Dojo Build System can be intimidating (see my first slide below).

I was concerned that it would be overshadowed or rendered irrelevant by the new web optimizer that ESRI is releasing shortly and previewed at the conference. However, this was not the case. The web optimizer looks awesome and will be a huge help for a lot of people, but there will always be those who want to keep the build process local and want total control over it. Hopefully my presentation will save these people some headaches.

Here are some resources from my presentation:

Summary Sheet
Example Projects