Tuesday, November 14, 2017

Photo Interpretation and Remote Sensing Module 10

This module's focus was the last method of image classification, supervised classification, which involves identifying known examples of different LULC classes so that the software can associate a spectral signature with each one and classify the remaining pixels based on their statistical similarity to the training classes. It's more work upfront than unsupervised classification, but there's the advantage of not having to go back and figure out what the classes are after the fact.
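Incidentally, the same workflow can be scripted with arcpy's Spatial Analyst tools, although we did this lab in ERDAS Imagine. A minimal sketch, with made-up file names:

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")
# Build spectral signatures from digitized training areas...
CreateSignatures("image.img", "training_areas.shp", "classes.gsg")
# ...then assign every pixel to its statistically closest class
classified = MLClassify("image.img", "classes.gsg")
classified.save("supervised.img")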

This was another fun lab assignment, but also kind of frustrating. Despite several attempts, I had a very hard time establishing a unique spectral signature for the road class, and as you might be able to see in my map below, even the final output has some confusion between the road and urban classes, and possibly also road and grass in some places.

You can also see from the distance output image (in which brighter areas correspond to areas that are more different from the established spectral signatures and thus more likely to end up misclassified) that there might be some issues with some of the agricultural areas and along the edges of certain features like the lakes. If this map were for a real-world application, those areas might need to be ground truthed or compared with other imagery to see if they need to be reclassified.


Monday, November 6, 2017

Photo Interpretation and Remote Sensing Module 9

This week we're back to land use/land cover classification from aerial imagery, this time using automated methods. In this lab, we learned how to perform unsupervised classification in both ArcMap and ERDAS Imagine, culminating in a classified image of the UWF campus. For this assignment, the image was automatically classified into 50 different classes based on pixel characteristics, then manually simplified into five classes by identifying the broad LULC category to which each of the original 50 classes belongs. My output is below; I ended up with a significant portion of the image falling into the "Mixed" class, which consists of pixels whose automatically assigned spectral class spanned multiple LULC classes (most of the confusion seemed to be between grass and buildings/roads--when I started out by assigning classes to the grass category, I suddenly had a number of green roofs!).
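As a technical footnote, the unsupervised step itself can also be scripted with arcpy's Spatial Analyst extension; a minimal sketch, with made-up file names (the 50-to-5 recoding was done manually):

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")
# Let the software find 50 spectral classes on its own;
# deciding what those classes mean comes afterwards
unsupervised = IsoClusterUnsupervisedClassification("campus.img", 50)
unsupervised.save("campus_50classes.img")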



Tuesday, October 31, 2017

Photo Interpretation and Remote Sensing Module 8

Another fun lab this week! After learning about thermal energy, how to work with thermal infrared imagery in both ArcMap and ERDAS Imagine (as well as how to compile multiple image bands into a single image in both programs), and some applications for thermal image analysis, such as identifying forest fires, we were asked to use the thermal band in one of the lab images to identify a feature of our choice.

I looked at the image from Ecuador using a panchromatic display of the thermal band in comparison to a true color composite and a panchromatic display of the near-infrared band. There were several things in the image that intrigued me (some of which I actually wasn't able to identify with certainty), but I eventually noticed this pair of bridges that only popped out at me when looking at the thermal band--in the visible and near-infrared bands, the one on the left is subtle and the one on the right is almost invisible, because their coloring isn't that far off from the water and both pavement and water absorb strongly in the near-infrared. But pavement certainly gives off thermal energy after warming up in the sun!

Anyway, I thought it was really interesting, because I wasn't expecting such a stark difference in what seems like it should be a prominent feature. There are certainly benefits to comparing multiple image bands.


Tuesday, October 24, 2017

Photo Interpretation and Remote Sensing Module 7

Lots of imagery and maps this week! We learned about working with multispectral imagery in both ERDAS Imagine and ArcMap. For the lab assignment, we were given a Landsat image and asked to identify three features based on clues about their reflectance in different bands of the image, i.e., at different wavelengths. Then we had to create composite images using band combinations that would make those features stand out.
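For anyone who prefers scripting, building a composite can also be done in one arcpy call; a quick sketch with made-up file names:

import arcpy

# Stack single-band rasters into one multiband image; the order
# of the inputs determines which bands display as R, G, and B
arcpy.CompositeBands_management("band4.tif;band3.tif;band2.tif",
                                "composite_432.tif")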

Here are mine:



One issue I had was that the first image (water bodies) looked very washed out once I brought it into ArcMap, and I couldn't figure out how to fix it; I was able to do some histogram correction using the Image Analysis window, but couldn't get it looking as good as it did in Imagine. What's puzzling is that this didn't happen with the other two images, so if it's related to creating the image subset, I don't know why it only caused a problem with one. Aside from that, I enjoyed this lab!

Tuesday, October 17, 2017

Photo Interpretation and Remote Sensing Module 6

This week we started learning about image enhancements that can be used for things like reducing noise and atmospheric effects and emphasizing or de-emphasizing certain features or details. The lab assignment involved experimenting with a number of enhancement tools in both ERDAS Imagine and ArcMap. For the final deliverable, the goal was to minimize the striping (caused by a problem with the sensor) in a Landsat 7 image while preserving as much detail as possible. Here's my attempt:


I had a hard time with this because I don't feel like I understand enough about how to use the different filters to be able to approach a problem like this strategically--I was just sort of throwing options at the image hoping something would work, and not surprisingly I'm not very happy with the result. Hopefully we're going to continue working on how to properly apply image enhancements in the coming modules, now that we know the basics of how to use the tools.
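For reference, the kind of operation I was experimenting with can be expressed in arcpy terms too; a minimal sketch (file names made up, and this is just one of many possible filters, not necessarily the right one for striping):

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")
# A low-pass (smoothing) filter averages each pixel with its
# neighbors, which softens striping but also blurs detail
smoothed = Filter("landsat7.img", "LOW")
smoothed.save("landsat7_smoothed.img")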

Tuesday, October 3, 2017

Photo Interpretation and Remote Sensing Module 5a

This week we learned the basics of how to view and navigate images in ERDAS Imagine. I've used this program before, but so long ago I don't really remember, so this is helpful, especially as I don't find it to be particularly intuitive to use. For the lab assignment, we had Imagine calculate the area covered by each type of land cover in a classified image, then extracted a subset of the image to create a map layout in ArcMap.
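The underlying math is just pixel count times cell area. I believe you can read a classified image's attribute table directly with a cursor in arcpy, something like this (field and file names are assumptions on my part):

import arcpy

desc = arcpy.Describe("classified.img")
cell_area = desc.meanCellWidth * desc.meanCellHeight
# Each row of the raster attribute table carries a pixel count
with arcpy.da.SearchCursor("classified.img", ["Class_Names", "Count"]) as cursor:
    for name, count in cursor:
        print("{0}: {1} square meters".format(name, count * cell_area))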


Tuesday, September 26, 2017

Photo Interpretation and Remote Sensing Module 4

This week we learned about using ground truthing to confirm land cover/land use classifications, and about calculating the accuracy of an LULC map based on ground truthing. Since we couldn't actually visit our sample sites for the lab assignment, we used Google Maps and Google Street View to confirm/correct classifications. Below is my map from last week, overlain with "ground truthing" points color-coded according to whether or not my original classification was correct at that location. Some points were more of a challenge to identify than others, particularly a few that had obviously changed between the time the photo was taken and the time the Google Maps imagery was last updated.
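As a side note, the accuracy calculation itself is the simple part; something like this, with made-up numbers:

# Overall accuracy = correctly classified points / total points
correct = 23
total = 30
print("Overall accuracy: {0:.1f}%".format(100.0 * correct / total))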


Tuesday, September 19, 2017

Photo Interpretation and Remote Sensing Module 3

This week was all about land use and land cover classification using aerial photos, and the lab assignment was to visually classify an image. We only needed to classify to Level II (out of four levels), so it's much less detailed and specific than it could be. Even so, it takes a lot of time to examine the image and identify various features to determine which category they belong in.

My final map is not as thorough as I would have liked, but my time was limited. Here's what I came up with:


Tuesday, September 12, 2017

Photo Interpretation and Remote Sensing Module 2

We're starting off the course on Photo Interpretation and Remote Sensing by examining aerial photographs, and this week we specifically looked at different methods of identifying the features in an image. 

The image below explores the range of tones and textures that can exist in a single photograph. Tone and texture can help in identifying a feature.


Features can also be identified by their size and shape, by the shape of the shadows they cast, because they form a pattern, or based on their association with other identifiable features. The image below shows some examples.


Wednesday, August 9, 2017

GIS Programming Module 11: Sharing Tools

This week was focused on different ways of sharing custom script tools for ArcGIS, and for the assignment we were provided with a script tool and asked to make some updates and then embed the script in the tool. This allows the toolbox to be shared without the separate script file, and also allows the creator to password-protect the script so that other users can't copy or edit it.

The script tool was already partially functional, but needed some hard-coded variables to be replaced with code that accepts user input from the tool interface. Then we needed to add descriptions for each of the tool parameters, and finally embed the script in the tool and set a password.
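The variable replacement boils down to something like this (the path and parameter index here are just examples, not lines from the actual tool):

import arcpy

# Before: a hard-coded input that only works on one machine
# input_fc = r"C:\Data\example.shp"
# After: the value comes from the first parameter in the tool dialog
input_fc = arcpy.GetParameterAsText(0)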

This screenshot shows both the tool interface and the output of an example from the lab using 50 points and distances of 10000 meters:



This is the last week of class, and I've had a great time and learned a ton. The most exciting and immediately useful thing I learned during this course is how to write geometries using a Python script, which I used in my final project to solve a real-life problem. I also found the lessons on turning a model into a script and turning a script into a custom tool to be very interesting, and I already have plans to apply those concepts outside of class as well. I also think it’s very cool that you can use Python scripts to manipulate GIS files and even map documents without ever even opening ArcMap.

Wednesday, August 2, 2017

GIS Programming Module 10: Custom Tools

 This week we learned how to convert a standalone Python script into a tool that can be run like any other ArcGIS tool, which makes scripts more versatile and shareable.

Assuming you've already written a script, the steps to turn it into a customized tool are:
create a new toolbox in ArcGIS > add the script to the toolbox and give it a name/description/etc. > set the tool's parameters > modify the script to allow it to accept user input for the parameters > change any print statements to arcpy.AddMessage (this makes them print to the tool progress window in ArcMap)
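In code, the last two steps look something like this (the parameter order is just an example):

import arcpy

# Values entered in the tool dialog arrive as parameters
in_features = arcpy.GetParameterAsText(0)
out_features = arcpy.GetParameterAsText(1)
# AddMessage writes to the tool progress window; print would not
arcpy.AddMessage("Processing " + in_features + "...")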

Here is the tool window for the script tool from this week's assignment (don't worry about the red Xs; ArcMap is just warning me that those default file paths don't exist on my local computer):


And here is the results window after running the tool successfully:


This was an easy lesson, but an incredibly helpful thing to know, since tools are so much easier to use and share, and better integrated into ArcGIS, than standalone scripts.

Wednesday, July 19, 2017

GIS Programming Module 8: Working With Geometries

This week we learned how to use Python to access geometric elements of features, like their shape type, coordinates, and length/area. For the assignment, we were tasked with writing a script that would iterate over all parts of all features in a shapefile and write the coordinates of each vertex, along with ID numbers and the name of the feature, to a separate text file (created within the script).

The steps: create a search cursor to find the necessary data > create a writable text file > loop over each feature > for each part of each feature, write the vertex coordinates and other data to both the text file and the interactive window.

Here's a simplified but runnable version of the script to show how it works (I've genericized the field names):

import arcpy

# Cursor returns the ID, geometry, and name of each river feature
with arcpy.da.SearchCursor("rivers.shp", ["OID@", "SHAPE@", "NAME"]) as cursor:
    text = open("rivers.txt", "w")
    for row in cursor:
        vertex = 0
        for part in row[1]:        # each part of a (possibly multipart) feature
            for point in part:     # each vertex in that part
                vertex += 1
                line = "{0} {1} {2} {3} {4}".format(row[0], vertex, point.X, point.Y, row[2])
                text.write(line + "\n")
                print(line)        # also echo to the interactive window
    text.close()

And here's part of the text file my script produced:


For me, this was definitely the hardest assignment yet, but I was really looking forward to this lesson and I'm glad to be starting to understand how manipulating geometries works.

Wednesday, July 12, 2017

GIS Programming Module 7: Manipulating Spatial Data

This week we tackled some new data structures and the use of cursors to find and edit data in tables. The script for this week's assignment had a number of steps:

create a new geodatabase > create a variable containing a list of all the feature classes in the module data folder > copy each shapefile from the module data folder to the database > create a search cursor that retrieves name, feature, and population for only the county seats from the "cities" shapefile > create an empty dictionary > add the name and population of each city to the dictionary > print the dictionary

It also prints messages at each step to let the user know what's happening and whether the script is running correctly, as you can see below in a screenshot of part of my script's output:


The hardest parts for me this week were writing the search cursor correctly and then figuring out the right code for adding data to the dictionary. At one point after sorting out the latter, I was still getting mysterious error messages (I was running lines of code in the interactive window as I went so I could see if they worked before writing them into the standalone script), and it turned out I still had a problem with the search cursor. It's always syntax issues for me!
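For reference, the cursor-plus-dictionary piece that gave me trouble reduces to something like this (field names and the query are assumptions, not copied from the assignment data):

import arcpy

county_seats = {}
# Pull name, feature type, and population for county seats only
fields = ["NAME", "FEATURE", "POP_2000"]
query = "FEATURE = 'County Seat'"
with arcpy.da.SearchCursor("cities.shp", fields, query) as cursor:
    for name, feature, population in cursor:
        county_seats[name] = population
print(county_seats)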

Wednesday, June 28, 2017

GIS Programming Module 6

This week we began working with arcpy and learning to code using geoprocessing functions. The lab assignment was to write a short script that calls a series of geoprocessing tools and prints the resulting messages (the same ones that would appear in the geoprocessing results window of ArcMap).

Specifically, it: adds XY coordinate data to the "hospitals" shapefile (Add XY tool) > prints messages > creates a 1000 meter buffer around features in the "hospitals" shapefile (Buffer tool) > prints messages > dissolves the individual buffers into a single feature (Dissolve tool) > prints messages
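Stripped down to its core, the script looks something like this (the workspace path is a placeholder; Add XY modifies the shapefile in place):

import arcpy

arcpy.env.workspace = r"C:\Data\Module6"  # placeholder path
arcpy.AddXY_management("hospitals.shp")
print(arcpy.GetMessages())
arcpy.Buffer_analysis("hospitals.shp", "hospitals_buffer.shp", "1000 Meters")
print(arcpy.GetMessages())
arcpy.Dissolve_management("hospitals_buffer.shp", "hospitals_dissolved.shp")
print(arcpy.GetMessages())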

Here are the results in PythonWin when the script is run:


Tuesday, June 20, 2017

GIS Programming Module 5: Geoprocessing

This week we looked at geoprocessing using ArcGIS tools, as well as ArcGIS's Model Builder. I've encountered Model Builder before, but learning that you can use a model as the base for a standalone Python script (and then, a Python script as the base for a tool in ArcGIS) is a game-changer. I've already thought of a real-world application to make my life easier outside of class.

Anyway, for this week's lab assignment, we were given several datasets and asked to make a model and then a Python script. I liked doing a model first instead of diving right into scripting, because the models are set up like flow charts, and it's a good way to visualize all the steps you need to go through to get the desired result. From there, it's easy to finalize the model, then export it to a script and finalize that so it can run independently.

For this assignment, it went something like this: all map layers as variables > clip each variable to "basin" layer shape > from clip output for "soils" layer, select all that are not prime farmland > use selection result to erase areas of basin layer
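In script form, that sequence is roughly the following (layer and field names are placeholders; my real script came from the exported model):

import arcpy

arcpy.env.workspace = r"C:\Data\Module5"  # placeholder path
# Clip every input layer to the basin boundary
for layer in ["soils.shp", "rivers.shp", "roads.shp"]:
    arcpy.Clip_analysis(layer, "basin.shp", layer.replace(".shp", "_clip.shp"))
# Select soils that are NOT prime farmland...
arcpy.Select_analysis("soils_clip.shp", "not_prime.shp",
                      "FARMLNDCL = 'Not prime farmland'")
# ...and erase those areas from the basin
arcpy.Erase_analysis("basin.shp", "not_prime.shp", "basin_final.shp")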

So the final output, of both the model and the Python script, is a shapefile of the basin layer minus parts that aren't prime farmland, like this:


Wednesday, June 14, 2017

GIS Programming: Peer Review 1

Article Citation:
Cerrillo-Cuenca, E. (2017). An approach to the automatic surveying of prehistoric barrows through LiDAR. Quaternary International, 435, 135-145. Retrieved June 14, 2017, from http://www.sciencedirect.com/science/article/pii/S1040618216002159

The author of this paper proposes a methodology for automated identification of prehistoric barrows (burial mounds) in western Europe using Python code to systematically analyze publicly available LiDAR data. Essentially, he has invented a method of creating a predictive model to help identify unknown barrow sites.

Cerrillo-Cuenca argues for the significance of his methodology on the basis that understanding the spatial patterning of barrows, particularly their relationship to settlement sites, is essential for understanding other aspects of prehistoric societies. Remote sensing and systematic survey are an efficient way to gain a better understanding of large areas, especially understudied ones, but, according to Cerrillo-Cuenca, other methods used to assess LiDAR data rely on visual interpretation, which has many limitations.

In the method outlined in this paper, the imagery is first processed to create a bare-earth topography model, which is then analyzed by a series of algorithms to identify locations that match the characteristic topography and shape of barrows, and return coordinates of those locations so that they can be confirmed, either on the ground or from aerial imagery.

I appreciated that Cerrillo-Cuenca tested his method in a controlled way by comparing predicted locations to previously recorded sites, and it does appear that the process is fairly successful in identifying well-preserved barrows. It does not do well with poorly-preserved barrows, which is to be expected, but it also seems to consistently miss barrows that are smaller than average. The author acknowledges this shortcoming and suggests the use of higher-resolution imagery, with the possible trade-off of getting more false identifications—of which there are already quite a few based on this article. I would like to read more about how both of these issues might be mitigated, although, to be fair, it sounds as if the methodology is still being refined at this time. I'm also curious as to how the author would make the case that this is a significant improvement over other LiDAR survey methods (it's likely faster, although I don't think he explicitly says so), given that it also has clear limitations and still requires review and confirmation of predicted locations.

On a more immediately practical level, the paper could also benefit from a clearer (and dare I say simplified?) explanation of the process used to identify potential sites, which was a bit over my head as someone with only a rudimentary understanding of this kind of analysis.

Overall, though, this is an innovative approach that reflects the trend in archaeology towards greater use of technologies like remote sensing and digital modeling, and any method—particularly one that can be automated—for identifying new sites has big implications not only for research but for risk assessment and preservation. No predictive model can be perfect, and this one does seem to need more work, but it sounds very promising.

GIS Programming Module 4: Debugging and Error Handling

This week's programming topic was debugging, or finding and fixing errors in code, and error handling, which means writing code that assumes certain kinds of errors may occur and "traps" them, allowing the code to complete anyway.

For the assignment, we were given three pre-written Python scripts containing errors, two of which we had to fix so that the code would run correctly, and the last of which was an exercise in error handling. All of the scripts this week use ArcGIS geoprocessing functions, so this was also my first official exposure to how Python and ArcGIS interact.

The corrected first script takes a shapefile and prints a list of the field names in the attribute table. It works something like this: import ArcGIS modules > assign workspace (map document) > assign feature class > make list of fields > for each item in fields list, print field name
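A bare-bones version of that workflow (paths and names are placeholders, not the graded script):

import arcpy

arcpy.env.workspace = r"C:\Data\Module4"  # placeholder path
fc = "example.shp"
# ListFields returns Field objects; each one knows its own name
for field in arcpy.ListFields(fc):
    print(field.name)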

Here are the results:


The second script is a bit more complex and was harder to fix, with more errors and more complicated ones. (I also ran into some added complications on this script and the third one because I'm working from my local computer rather than the UWF GIS remote desktop and thus have to change the file paths in the scripts, which introduces opportunities for new errors!) I'm not sure about everything this script is doing, because we haven't learned much about the geoprocessing functions yet, but in the end, it prints a list of the layers within a specified data frame:


I was nervous about the error handling part of the assignment, but it turned out to be very easy--a generic try-except statement rather than trying to deal with specific kinds of errors separately. The third script has two parts, the first of which is supposed to turn on the visibility and labels for a layer within an ArcMap document, but is written so that it raises an exception. I enclosed most of the code for this section of the script in the try-except statement, which means that the script now prints "Error:" and the error message for this first part before continuing to run the second part, which prints the name, spatial reference, and scale for the specified data frame (results pictured below).
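The structure is roughly this (the .mxd path and layer name are placeholders, and the "broken" line is my own stand-in for the assignment's error):

import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\Data\Module4\example.mxd")  # placeholder
try:
    # Part A: the intentionally broken section (e.g., a bad layer name)
    lyr = arcpy.mapping.ListLayers(mxd, "No Such Layer")[0]
    lyr.visible = True
    lyr.showLabels = True
except Exception as e:
    # The trap: report the error instead of crashing
    print("Error: " + str(e))
# Part B runs regardless of what happened above
df = arcpy.mapping.ListDataFrames(mxd)[0]
print("{0}, {1}, scale 1:{2}".format(df.name, df.spatialReference.name, int(df.scale)))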


Another issue I had this week is that I was using IDLE as my Python editor instead of the recommended PythonWin, and I have two versions of it on my computer, a standalone copy and the one that installs along with ArcGIS. The standalone version doesn't recognize arcpy, the module containing the GIS-specific functions, but it is the one my scripts open in by default, so I kept getting error messages about arcpy that weren't part of the assignment. So frustrating! But at least I figured it out, and now know that if I want to use IDLE, I have to open the right version. Or maybe I should just stick to PythonWin.

Wednesday, June 7, 2017

GIS Programming Module 3: Python Fundamentals, Again

We're still going over the basics of using Python, and this week's topic was conditional statements and loops, which are some of the building blocks of more sophisticated code. For the lab assignment, we were asked to complete a partial script. The finished script prints out the results of a dice-throwing game. Then it generates a list of random numbers, removes any instances of a specific number from that list, and prints the final result. Here's the output:


The trickiest things for me this week were remembering the names of the functions and methods I want to use (I have a problem with knowing what Python can do but not how to do it) and remembering basic details like converting integers to strings in a print statement and adding spaces between things that are going to be printed. I also briefly forgot that the remove method only takes out the first instance, not all of them, and had to look that up to figure out the right code to finish the script. But overall, so far so good.
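Here's the fix for that last part, sketched out with my own variable names:

import random

numbers = [random.randint(0, 10) for i in range(20)]
unwanted = 8
# list.remove() only deletes the first match, so keep going
# until the value is completely gone
while unwanted in numbers:
    numbers.remove(unwanted)
print(numbers)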

Saturday, May 27, 2017

GIS Programming Module 2: Python Fundamentals

This week in GIS Programming was a crash course (for me, a welcome review) of the basics of Python syntax and how to manipulate different data types. The lab assignment was to write a short script that began with a string containing my full name, extracted the last name, calculated its length, and ultimately returned the last name and a number three times the number of characters it contains. This output is pictured below:


The steps in the coding process went something like this: assign full name to a variable --> split string into list --> print the last item in the list --> calculate the length of the last item in the list --> multiply it by 3 --> print that result.
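In code form, with a placeholder name, those steps look like this:

full_name = "Firstname M. Lastname"  # placeholder
name_list = full_name.split()        # split string into a list
last_name = name_list[-1]            # last item = last name
print(last_name)
print(len(last_name) * 3)            # three times its length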

Tuesday, May 23, 2017

GIS Programming Module 1: Introduction to Python

Not much to report from the first week of GIS Programming. This week was a very basic overview of the Python scripting language and how the course is going to work. For the first assignment, we were given a prepared Python script to automatically set up a folder tree to be used for file storage throughout the course. All we had to do was open it in PythonWin and run it, but it was pretty neat:


12 folders with more folders inside them, and all I had to do was click a button.

I took some programming classes just for fun a few years ago, so I'm pretty excited to learn more about using Python this semester, and especially how to apply those skills to GIS. 

Wednesday, May 3, 2017

Cartographic Skills Final Project

For the very last Cartographic Skills assignment, we were tasked with making a publication-quality map using two thematic datasets and all of our skills from the past semester. I opted to find my own data and make a map with a political theme. Following in the footsteps of other mapmakers who've created election results maps that show politically divided states or counties as shades of purple rather than just the red and blue of the winning party, I mapped the county-level results of the 2012 presidential election using a red-purple-blue color scheme. Then I overlaid cities and towns symbolized according to their population size as of the 2010 census. The point of this was twofold: illustrating the variation that underlies the question of red or blue (in some counties the win was by a large margin; in others it was extremely close), and comparing voting patterns with population distribution in an attempt to investigate that so-called "urban-rural divide" plaguing modern politics.

I focused on Washington, DC and the adjacent states of Maryland and Virginia, partly because it's a region I know well and partly because DC is the capital, and is a major city in a region that has both other major cities and really rural areas, so what better place to start looking at how election results map out? If you're wondering why I chose the 2012 election rather than 2016, it's because 1) the data was easily available, already compiled and published by someone else, 2) it happened closer to the most recent census, so the population data seemed more relevant, and 3) 2016 had a lot of surprises and I wanted to start with something that might be a little more typical.

As you can see below, there's a lot of red in those "blue" states, especially in the west, and there's also a lot of purple, which indicates counties where there are a lot of Republican AND a lot of Democratic voters. The actual value being mapped is the difference (in percentage points) between the major candidates' shares of the total vote. You can also see that there's something going on with regard to urban areas trending blue and sparsely populated ones trending red, but there are also some areas that don't appear to fit that trend, which is interesting.


Both the main map and the inset maps, which provide close-ups of the most crowded urbanized areas, were created in ArcMap, and the fine-tuning and final layout were done in Adobe Illustrator. I used a choropleth map for the election results, because color is the obvious way of illustrating election results and a choropleth map made a good base to overlay the population data on. The data was manually classified to help simplify a kind-of-confusing legend by using nice round numbers to group the data, although the classes are similar to what ArcMap came up with using natural breaks. Eight classes is kind of a lot, but I wanted to show as much variation as possible rather than just a few classes with huge data ranges (there's a big difference between a 5-point margin and a 30-point margin, after all), and eight classes seemed about the upper limit of what would keep those classes visually distinguishable. For city and town populations, my stand-in for population distribution, I used proportional symbols, which worked better than classed graduated symbols given the huge range in population sizes. I opted to only display cities and towns of at least 5,000 people and only label a selection of those, lest the map get too complicated, and I also made the city symbols semi-transparent to help minimize the degree to which they obscure not only each other but the underlying choropleth layer. Finally, I made the city symbols a contrasting color, and made sure everything else in the layout was grey or black to ensure that both bright-colored datasets really stand out.

This project was a lot of work, but I'm pleased with the result. I've really enjoyed this class and have learned a lot about good map design.

Sunday, April 30, 2017

GIS Final Project

The very last assignment for Intro to GIS was an assessment of the Bobwhite-Manatee Transmission Line, which was constructed several years ago in Manatee and Sarasota Counties in southwest Florida. The object of the final project was to take the preferred corridor for the transmission line and analyze its impact on the environment and communities to determine whether it met the criteria its placement was supposedly based on.

The criteria assessed for this assignment were selected from a list used in the real-life planning of the transmission line. They were:

Does the preferred corridor avoid large areas of environmentally sensitive land? (Conservation lands and wetlands were analyzed for this project.)
Does the preferred corridor avoid homes? (This analysis involved digitizing possible houses from aerial imagery, as well as calculating the number of land parcels impacted.)
Does the preferred corridor avoid schools/daycares?
And is the preferred corridor feasible in terms of length and cost?

All analysis was performed in ArcMap. Analysis of each of the first three criteria primarily used either the Clip tool (with the preferred corridor as the clip shape) or a Select By Location query to identify impacted features. For the second and third criteria, a 400 foot buffer around the preferred corridor was created in ArcMap in order to include features in proximity to the transmission line as well as those directly impacted by the corridor. Length and cost were calculated based on the estimated centerline of the corridor, using ArcMap's measurement tool and a provided equation, respectively.
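The basic pattern for each criterion looked something like this in arcpy terms (file names and the exact calls are illustrative, not copied from my project):

import arcpy

# 400-foot buffer around the preferred corridor
arcpy.Buffer_analysis("corridor.shp", "corridor_400ft.shp", "400 Feet")
# Features directly impacted: clip to the corridor itself
arcpy.Clip_analysis("wetlands.shp", "corridor.shp", "wetlands_impacted.shp")
# Features in proximity: select whatever intersects the buffer
arcpy.MakeFeatureLayer_management("parcels.shp", "parcels_lyr")
arcpy.SelectLayerByLocation_management("parcels_lyr", "INTERSECT",
                                       "corridor_400ft.shp")
print(arcpy.GetCount_management("parcels_lyr").getOutput(0))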

For more details and results (not to mention the maps), my final presentation and write-up are available online:
PowerPoint
Presentation Transcript

The short version: The transmission line is an estimated 25 miles long and impacts a fairly minimal amount of sensitive land, a relatively low number of homes and parcels, and no schools or daycares. In other words, it basically does a good job of fulfilling the criteria.

Monday, April 17, 2017

Cartographic Skills Module 12: Neocartography and Google Earth

In the final module for Cartographic Skills, we discussed the changing nature of cartography in the 21st century, given advances in technology, new applications, and especially VGI, or volunteered geographic information, meaning spatial information submitted by the public as opposed to collected and compiled by professional cartographers. 

For the lab assignment, we looked at ways to explore and share data using Google Earth, which is, of course, freely available to the public and relatively easy to use. The first part of the lab was to return to the dot density map created two modules ago and export it in KML format. KML files can then be opened in Google Earth, overlaid on the standard Google Earth imagery, and shared with other Google Earth users. As a method of simply viewing map layers, this is much more accessible than ArcMap.
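The export itself can even be scripted; a one-line sketch with placeholder names (the tool actually writes a compressed KMZ):

import arcpy

# Convert a saved layer file to KML/KMZ for Google Earth
arcpy.LayerToKML_conversion("dot_density.lyr", r"C:\Data\dot_density.kmz")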


In the second part of the lab, we used Google Earth's tour feature to record a tour of places in south Florida. Google Earth allows users to create placemarks that make it easy to zoom in or out and to transition from one view to another, so that you can navigate to a series of places easily. After you've recorded a tour, you can save it along with your map layers and export a package that can, again, be shared with other Google Earth users, who can play the tour as well as explore your map on their own. 

The record tour function automatically records every move you make while navigating around the map, so I found it a little tricky to do, as I had a hard time zooming, rotating, and panning smoothly while looking around a stop on the tour. Trying to get the scenes you want without giving the viewer motion sickness might take some practice! My tour for this assignment definitely wasn't perfect. Still, it's a very cool feature that I didn't know how to use before, and I'm now full of ideas for how to use it  in the future for sharing maps and places.

Friday, April 7, 2017

Cartographic Skills Module 11: 3D Mapping

This week we studied 3D mapping in Cartographic Skills as well. For the first section of the lab, we went through an ESRI training course on the basics of visualizing 3D data in ArcScene, including data types, how to set base heights and extrude features, and how to manipulate lighting. For me personally, the hardest part about adjusting to 3D scenes is how to navigate them successfully, but that's probably due at least in part to the fact that I'm currently working on a laptop with just a trackpad. I assume you get better control with a mouse!

For the rest of the lab assignment, we worked with data on building footprints in downtown Boston. Given the building footprints and a LiDAR dataset of the same area, we extracted height information for the buildings from the LiDAR file by taking a series of sample points and then using the average z value for each building footprint as the height for that building. With that data, we could display the buildings in 3D in ArcScene. Then we exported that layer as a KML file and opened it in Google Earth. The end result looked like this:


3D mapping is useful for urban planning, simulations, modeling industrial activity, visualizing environmental changes, or any application where it's helpful to view terrain in a similar way as it appears in the real world. It can also be helpful in viewing multiple datasets simultaneously, whereas the same map in 2D might get too crowded or complex to be useful. 3D maps can also be used to display data with z-values whose meaning isn't inherently spatial. For example, things like temperature, population, and property values can be visualized in 3D.

Some advantages of 3D maps are their versatility and the ability to explore complex spatial relationships, as well as to simulate viewsheds/lines-of-sight and what an area would look like at different times of day or year. However, care has to be taken to display height or depth data accurately, or to make it clear what information is being presented in the case of data like property values, as it could be easy for the viewer to misunderstand what they're seeing. It can also be hard to see certain features if the map covers a large geographic area or if some features block others from view. This problem is somewhat avoidable if the viewer has the ability to zoom in and out and navigate around the map, but if the map must be viewed outside of ArcGIS, exporting multiple static views from different angles or scales might be necessary to show a complete picture of the data.

Thursday, April 6, 2017

GIS Week 13: Georeferencing and ArcScene

For the final module in Intro to GIS, we got to explore georeferencing and mapping in 3D. For this lab assignment we were given two aerial images of the UWF campus that had no spatial reference, and georeferenced them (in other words, assigned real-world coordinates) based on existing shapefiles of campus buildings and roads. We then edited those shapefiles to digitize two features (one building and one road) that were present in the photos but not yet in the shapefiles. The results are below, along with a second map showing the location of UWF's eagle nest in relation to those features. This second map also shows the easement buffers created to protect the nest, which were drawn using ArcMap's multiple ring buffer tool.
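The buffer step is a single tool call; something like this, with placeholder file names and distances:

import arcpy

# One call produces all the concentric easement rings at once
arcpy.MultipleRingBuffer_analysis("eagle_nest.shp", "nest_buffers.shp",
                                  [330, 660], "Feet")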


 After all that, we took our first stab at using GIS in 3D. The building and roads shapefiles and the aerial imagery were imported into ArcScene and draped over a 30-meter DEM. The buildings were also extruded based on the heights listed in their attribute table, and then the 3D aspect of the entire scene was exaggerated somewhat to make the features stand out more. The final scene was exported as an image so that the layout could be finalized in ArcMap, as ArcScene doesn't allow for the creation of additional map elements like legends.

Sunday, April 2, 2017

Cartographic Skills Module 10: Dot Mapping

This week we explored dot mapping, which uses small dots to indicate occurrences of specific geographic phenomena. Each dot represents a set number of instances of the phenomenon being mapped, and the number of dots in an enumeration unit is proportional to the real-world quantity in the same area. Dot mapping is a good way of visualizing spatial distribution, as it brings out patterns nicely, especially if care is taken to place dots in areas where the phenomenon in question is most likely to occur, by using other geographic features that affect the phenomenon as a guide to dot placement.

As an example, the lab assignment this week uses dots to represent the population of southern Florida. We used census counts, with counties as the enumeration unit. In my map, below, each dot represents 20,000 people. The map was created in ArcMap with some finishing touches applied in Adobe Illustrator. (And I've realized after the fact that the title should say "population distribution" rather than "population density"--oops!)

Without additional input from the mapmaker, ArcMap places dots randomly throughout enumeration units. So, we used an additional layer (not shown in the final map) extracted from a landcover type dataset to have ArcMap place dots only within urban land areas--in other words, developed areas where we know people actually live. This gives us a more realistic and accurate distribution map that allows us to see where the human population is concentrated (with the highest density occurring along the southeastern coast around Miami) and where it is more dispersed (such as much of the interior of this part of the state). 


Thursday, March 30, 2017

GIS Week 12: Geocoding and Network Analysis

This week in Intro to GIS, we looked at geocoding (locating addresses) and network analysis. We used a TIGER/Line road network file from the US Census Bureau to create an address locator that was used to geocode the addresses from a table of EMS Station locations in Lake County, FL. Most of them matched automatically, but a few had to be located by hand due to discrepancies between the road data and the address table. Then, we chose three of those stations and set up and ran a network analysis to find the fastest route between them. Below is the map that resulted, showing the locations of all the EMS stations along with the best route to get to three selected ones:


Monday, March 27, 2017

GIS Week 10: Vector Analysis

For this lab assignment, we used buffer and overlay tools in ArcMap to produce a map of possible campsites in a section of DeSoto National Forest. Roads and water features were buffered at various distances, and the two resulting sets of buffers were overlaid to find areas that were within the appropriate distance of both roads and water. Finally, areas that fell within conservation areas were eliminated, leaving only those areas that are close enough to roads and water to be suitable campsites, but outside conservation areas.
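In arcpy terms, the whole analysis reduces to a few calls (distances and names here are placeholders, not the assignment's actual parameters):

import arcpy

arcpy.Buffer_analysis("roads.shp", "roads_buffer.shp", "300 Meters")
arcpy.Buffer_analysis("water.shp", "water_buffer.shp", "500 Meters")
# Overlay: keep only areas close to BOTH roads and water...
arcpy.Intersect_analysis(["roads_buffer.shp", "water_buffer.shp"],
                         "near_roads_and_water.shp")
# ...then drop anything inside a conservation area
arcpy.Erase_analysis("near_roads_and_water.shp", "conservation.shp",
                     "possible_campsites.shp")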


Sunday, March 26, 2017

Cartographic Skills Module 9: Flow Line Mapping

This week's lab adventure was creating a flow map, or a map that depicts movement (of people, goods, traffic, etc.) from one place to another. In this case, the dataset was numbers of immigrants to the United States from various parts of the world, and the map has two parts: flowlines showing total numbers of immigrants from each region, and a choropleth map of the U.S. showing the percentage of those immigrants that ended up in each state.

This map was more fun to make than I thought it was going to be when I first realized the whole thing needed to be done by hand in Adobe Illustrator (a personal nightmare). There are a lot of stylistic things you can do with this kind of map, starting with drawing the flow lines themselves and having complete control over what they look like and how to place them. For this assignment, we drew the biggest line first, tweaking it until it looked good, and then used Excel to calculate proportional widths for the remaining lines based on the width of the first one. That way the relative size of the lines reflects the numbers of people they represent. 
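The width calculation is simple proportional scaling; assuming linear scaling (my actual numbers were different), it looks like this:

# Width of each flow line, scaled from the biggest one
max_width = 30.0       # width chosen for the largest flow, in points
max_value = 4500000.0  # immigrants represented by the largest flow
value = 1200000.0      # immigrants for the line being drawn
width = max_width * value / max_value
print(width)           # 8.0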

Once I had the lines ready to go, I made them partly transparent so as not to hide country or continent boundaries, and added drop shadows to make them pop out to the reader. (I also used a drop shadow to make the U.S. stand out, to draw attention to the choropleth data there and to the fact that it's the endpoint for the rest of the data.) I also used a little bit of a gradient pattern for the stroke of the flow lines, and finally, I used AI's text on a path tool to make the flow lines' labels match their curves.

One other thing I did with the lines, which wasn't really part of the assignment, was separate out Canada. The data we were using was totaled by continent, but that meant North America was going to either need two separate arrows or two converging arrows in order to not be misleading, and either way I wanted to get the proportions right. Fortunately, the Excel spreadsheet also broke down the immigration totals by country, even though we didn't need to use that part, so I was able to subtract Canada from the rest of North America and satisfy my compulsion for accuracy.


Friday, March 10, 2017

Cartographic Skills Module 8: Isarithmic Mapping

On the surface, this was a pretty easy week in Cartographic Skills, but I'm continuing to struggle with some of the finer points of using Adobe Illustrator, and this week's map is not my best. So it goes.

The topic of the week is isarithmic mapping. Isarithmic maps are used to map continuous data, such as temperature or topography. The data can be symbolized using graduated color, contour lines, or both. In the lab exercise for this week, we looked at a dataset for annual precipitation amounts across Washington state. Although the data was first mapped using continuous tone symbology, in which the data is unclassed and stretched over the full range of a color ramp, for the final map, pictured below, the data is mapped using hypsometric tints. It is divided into classes (we set the ranges manually) with a color assigned to each class. In this case, contour lines were also created that fall at the divisions between classes, to make it easier to distinguish them when reading the map.

This type of map, which illustrates the yearly average precipitation over a 30-year time span, is typically used to study long-term climatic trends. This particular dataset was created using the PRISM system, which relates data collected at weather monitoring stations to elevation and other variables in order to interpolate the data with respect to known weather patterns as well as location. The blurb on the map explains PRISM in a bit more detail.


Tuesday, March 7, 2017

GIS Weeks 7-8: Data Search

For the midterm lab assignment for Intro to GIS, we were asked to acquire a list of datasets for a county in Florida, project all of them into the same coordinate system, "clip" state-level or other larger datasets to the shape of the county in question using the tools available in ArcMap, and create 1-3 useful maps.

I was assigned Flagler County, on the state's Atlantic coast. My first set of maps, below, includes one illustrating some key natural and man-made features in the county, including waterbodies, cities and towns, roads, and parks, and one showing Strategic Habitat Conservation Areas symbolized according to their priority ranking (with cities and roads marked to help orient the reader). For both of these maps, the data is overlaid on a Digital Elevation Model, allowing the reader to see how all the features represented interact with the county's terrain.



I had one other dataset that I felt was too complex to be combined with any others, so my third map just shows land cover categories for the county. However, one of the required datasets was a DOQQ, or digital orthophoto quarter-quadrangle, so I also included an inset zoomed in on part of the city of Palm Coast, where the land cover dataset is overlaid on top of the aerial imagery.




Sunday, March 5, 2017

Cartographic Skills Module 7: Choropleth and Proportional Symbol Mapping

In this assignment for Cartographic Skills, we tackled two types of thematic maps, choropleth and proportional symbol, by creating a single map of Europe showing population density (as a choropleth map) and per capita wine consumption (as a proportional symbol overlay).

The map was created in ArcMap using provided data and finished using Adobe Illustrator (that was a LOT of symbols and labels to adjust manually--yikes). I chose natural breaks classification for both population density and wine consumption, because I thought it visualized the data best, showing the overall variation while also capturing outliers at the high end of each range (for example, the Netherlands for population density, and Vatican City for wine consumption, which ended up being a class by itself because it's so much higher than everything else). ArcMap classifies the data automatically based on an algorithm that looks for existing gaps to separate classes. I also chose to use graduated symbols for the wine consumption layer rather than proportional symbols--same concept, but slightly different presentation. With graduated symbols, the data is divided into classes, and each class is assigned a single symbol size rather than sizing each symbol directly by its data value. I thought that in this case it was easier to distinguish the symbols this way (that's also why I used only four classes--I found if I used five, it was hard to tell the difference between the middle symbols anyway, which sort of defeats the purpose) and easier to understand what the symbols represent.

Anyway, the map:

Some general trends you might notice are that Western Europe tends to have higher levels of wine consumption, while Scandinavia and much of Eastern Europe seem to drink less wine. The colder parts of the continent, predictably, tend to have lower population densities. I don't see any very obvious correlation between population density and wine consumption, though that could be a function of the way the data ended up being represented here. It's still an interesting map.

Thursday, February 23, 2017

Cartographic Skills Module 6: Data Classification

This week was all about data classification, or how to categorize your data in order to display it on a map. For the lab assignment, we worked with census data for Miami-Dade County, FL to map two related demographic variables using four different classification methods. As you can see, the classification method makes a big difference in how the data is ultimately visualized, which is why it's important to fully consider the nature of the dataset and the purpose of the map before choosing a classification method.



Pictured here are the equal interval, quantile, standard deviation, and natural breaks methods of classification. All four maps use the same dataset and show the number of senior citizens per square mile in each census tract (the other set of maps from this lab illustrated the percentage of the population over 65). In each case the data classification was performed automatically within ArcMap. In equal interval classification, the data is divided into classes that each cover the same range of data values, which in this case shows us that most of the county has a low density of people over 65, but doesn't show a lot of variation. The quantile method divides data into classes that each have an equal number of data observations, which definitely brings out the subtle variation in this dataset, although you have to look closely at the legend and realize that most of the classes are pretty close together in terms of the values they contain. The natural breaks method tries to group like values and make each class as different from the others as possible, which can lead to some weird ranges of values for those classes, but does bring out some variation in the data without overstating it. Standard deviation is a bit different. It classifies data values according to how far they fall from the mean (i.e. within one or two standard deviations above or below the mean). In this case, that shows us that the density for most of the county is below the mean, with a cluster of tracts in the northeast that are around or above the mean.
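To make the contrast concrete, here's how the first two methods compute their class breaks; a quick sketch using numpy, with made-up values:

import numpy as np

values = np.array([5, 12, 18, 40, 55, 90, 300, 750, 2200])  # made-up densities
k = 4  # number of classes
# Equal interval: each class spans the same range of data values
print(np.linspace(values.min(), values.max(), k + 1))
# Quantile: each class holds the same number of observations
print(np.percentile(values, np.linspace(0, 100, k + 1)))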

Tuesday, February 21, 2017

GIS Week 6: Map Projections, Part 2

This week's lab was a continuation of our exploration of spatial referencing and map projections. For this complex assignment, we were required to download data in various formats, from various sources, and make sure all of it ended up in the same projection. Some of the datasets were already ArcMap-ready and just needed to be re-projected using the Project tool provided in ArcMap; in other cases, we needed to define the coordinate system for a raster image file, or import x,y data from a spreadsheet, define its coordinate system, create a shapefile, and re-project it to the correct projected coordinate system (in this case, State Plane, Florida North).
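Condensed into arcpy calls, the workflow looks something like this (paths, field names, and the coordinate system codes I picked are illustrative):

import arcpy

sp_fl_north = arcpy.SpatialReference(2238)  # NAD83 StatePlane Florida North (US ft)
nad83 = arcpy.SpatialReference(4269)        # geographic NAD83
# Raster with no defined coordinate system: define, don't project
arcpy.DefineProjection_management("quad5160.tif", sp_fl_north)
# Vector data with a known coordinate system: re-project
arcpy.Project_management("roads.shp", "roads_spn.shp", sp_fl_north)
# Spreadsheet of x,y points: make an event layer, save it, re-project
arcpy.MakeXYEventLayer_management("tanks.csv", "LONG", "LAT", "tanks_lyr", nad83)
arcpy.CopyFeatures_management("tanks_lyr", "tanks.shp")
arcpy.Project_management("tanks.shp", "tanks_spn.shp", sp_fl_north)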

The screenshot below shows the end result, with all the data lined up correctly in the right place. Pictured is aerial imagery for USGS quad 5160 in the far southwestern tip of Escambia County, FL, overlain by Florida county boundaries and the USGS quad index, along with major roads and point data indicating the location of petroleum storage tank contamination monitoring sites.


Tuesday, February 14, 2017

Cartographic Skills Module 5: Spatial Statistics

This week in Cartographic Skills, we went through an ESRI training course on using spatial statistics to explore data in GIS before performing any analysis. ArcMap includes various tools that allow you to look at things like how your data is distributed, whether it's autocorrelated (meaning whether values are dependent on location -- pretty important for spatial analysis!), how variable it is across your study area, and whether there are any identifiable trends. All of this information can then be used to make an informed decision about what kind of analysis would be best to use. These tools also offer several ways to spot outliers in the data so that they can be corrected or removed so as not to skew the analysis.

While exploring the statistical tools available in ArcMap, we worked with temperature data collected from weather monitoring stations in western Europe. The map below shows the geographic distribution of those stations, as well as the mean center and median center calculated from the locations of all the monitoring stations in the dataset. The red oval indicates the directional distribution (roughly east-west) of the location data, and encircles all the stations whose location is within one standard deviation of the mean center. All of these were calculated within ArcMap using tools found in the standard spatial statistics toolbox.
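Each of those measures is a single tool call; a sketch with placeholder file names:

import arcpy

arcpy.MeanCenter_stats("stations.shp", "mean_center.shp")
arcpy.MedianCenter_stats("stations.shp", "median_center.shp")
# The ellipse sized to one standard deviation of the point locations
arcpy.DirectionalDistribution_stats("stations.shp", "std_ellipse.shp",
                                    "1_STANDARD_DEVIATION")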

Sunday, February 12, 2017

GIS Week 5: Map Projections, Part 1

This week we started learning about map projections and how to re-project spatial data in ArcMap. This week's lab assignment was to display Florida counties using three different map projections, thereby illustrating the way different projections display (and distort) information differently. As you can see from the map below, where all three projections are lined up side by side, Florida's shape and orientation look slightly different depending on which coordinate system is used. More than that, sizes are affected. We used ArcMap to calculate the area for each of four counties in each map projection, and came up with three very different sets of numbers. Some parts of the state aren't affected too much, but others, specifically those farther away from a particular projection's most accurate region, vary a lot. Based on this, it's easy to see how important it is to make sure all of your data uses a consistent coordinate system before trying to do any spatial analysis.


Saturday, February 11, 2017

Cartographic Skills Module 4: Cartographic Design

In this lab assignment for Cartographic Skills, we were tasked with making a map showing the locations of public schools in Ward 7 of Washington, D.C. We also needed to display the schools by type (elementary school, middle school, and high school). Since this week's lesson was about good cartographic design, this map is also an opportunity to showcase the principles of visual hierarchy, contrast, figure-ground relationship, and map balance.

To that end, my first step after assembling all the data was to weed out unnecessary information so as not to distract from the map's purpose (the schools in Ward 7). I clipped the schools and parks layers to display only those in Ward 7, and I chose to display all the roads within Ward 7, but only major roads and highways in the surrounding area. I also made the Ward 7 polygon a light grey in contrast to a darker grey for the rest of D.C. and a light purple for the background. All of this was to establish figure-ground, where the eye is drawn to the area of focus. The roads and neighborhood labels are also shades of grey, dark enough to be seen but similar enough to the other base information that they aren't distracting, while the stark contrast of the red school symbols makes them stand out as the focus of the map. (I opted to let the parks stand out a little, too, to show their distribution in relation to the schools, but the schools still stand out most because of the bright color.) In other words, the visual hierarchy emphasizes the schools over the base information. In order to distinguish between the different types of schools, I employed contrast again in making the symbols different sizes. Finally, to achieve good balance, I centered the title and placed the legend and inset map so that each was occupying one large empty space, and spread out the remaining map elements to fill in the rest of the available space at the bottom of the page.

I was a little pressed for time this week and since using Adobe Illustrator wasn't required, I opted to make and finish my map entirely in ArcMap, with which I'm more familiar.


Sunday, February 5, 2017

GIS Week 4: Map Sharing

In this week's GIS lab, we learned about different ways of sharing map data with others and the public. The assignment was to create a geocoded "Top 10" list (in the form of a table of addresses tied to a map layer of point features) that was saved/presented three different ways: as a publicly viewable webmap on the ArcGIS website, as a map package that can be shared to use within ArcGIS Desktop, and as a KMZ file for use with Google Earth.

My Top 10 list is simply a list of some of the breweries and wineries in the area of central Virginia where I lived until recently. I thought that was a fun subject, and my familiarity with the area made it easy for me to see that my data was geocoded and displayed correctly.

If you're interested, you can view my webmap here: http://arcg.is/2l7ulzx.

Saturday, February 4, 2017

Cartographic Skills Module 3: Typography Lab

This week's assignment for Cartographic Skills was to create and label a map of Marathon, in the Florida Keys, in accordance with the lesson material on good cartographic design and typography. The base map and inset map were created in ArcMap and exported to Adobe Illustrator. I consulted with Google Maps to locate each of the places that needed to be labeled.

I first labeled the islands, in all caps as is suggested for areal features, using AI's "type on a path" function to place labels directly on the islands where they would fit, and using leader lines for those that were too small. Bodies of water are also labeled in all caps, and I made the type blue and sheared it to give it the appearance of italics, both of which are conventional for labeling hydrographic features. Since the city of Marathon actually covers most of the islands pictured, I opted to indicate it with areal color (green) rather than as a particular point. Finally, I placed point symbols at specific locations on the map to indicate towns, the airport, a state park, and a country club, and labeled these using "normal" text (but black to differentiate from the dark grey used for the island labels) and leader lines, since none of them could be labeled without the label overlapping the coastline or another feature. I made two modifications to the symbols (all of which came from the libraries included in AI): 1) I made the state park symbol a darker green than the default to make it stand out better, and 2) I gave all the symbols a little bit of a drop shadow to help them pop out to the reader, since they're kind of small and this is a moderately busy map.

While Illustrator is a powerful tool, I'm not finding it particularly intuitive, and it seems to me that it's a lot more work to label by hand and create a legend from scratch than it would be to do those things in ArcMap first and then use AI to edit them and add finishing touches. Hopefully, as the course progresses, I'll get better at using AI--or if not, at least get to experiment with other approaches to finishing maps.

Thursday, February 2, 2017

GIS Week 3: Cartography Lab

This week was all about incorporating cartographic best practices into our GIS outputs, and the lab assignment walked us through creating three different maps of Mexico, each highlighting different types of data and how we could use ArcMap to present that data clearly and attractively. This week was also the first time in this course that we looked at raster data (a DEM) and delved into customizing a map's legend.

The map I'm choosing to share (below) isn't my favorite from this week, but it's the one I feel I got the most out of making. For this map we were tasked with simplifying a rather complex (and therefore difficult to read) set of transportation features by using the attribute tables to single out only the most important rivers and highways for display, and by ordering the map layers so that everything was as visible as possible. (I have mixed feelings about my red road network, but I needed it to be distinct from the black railroads and also easy to see without being mistaken for a natural feature.) We also played with the city labels to make them more readable and to make the Distrito Federal stand out. The inset map shows which area of Mexico the main map displays, and the coolest thing I learned this week was how to make ArcMap automatically highlight the extent of the main map on the inset map.
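For anyone curious, the same filtering can be done programmatically with definition queries; here's a hypothetical arcpy sketch (layer and field names are made up), run from ArcMap's Python window:

    import arcpy

    # Show only the most important rivers and highways by applying
    # definition queries to the layers in the open map document.
    # Layer and field names are hypothetical.
    mxd = arcpy.mapping.MapDocument("CURRENT")
    df = arcpy.mapping.ListDataFrames(mxd)[0]

    rivers = arcpy.mapping.ListLayers(mxd, "Rivers", df)[0]
    rivers.definitionQuery = "\"CLASS\" = 'Major'"

    roads = arcpy.mapping.ListLayers(mxd, "Highways", df)[0]
    roads.definitionQuery = "\"TYPE\" IN ('Federal', 'Toll')"

    arcpy.RefreshActiveView()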

Thursday, January 26, 2017

Cartographic Skills Module 2: Intro to Adobe Illustrator

This week's lab was all about getting acquainted with Adobe Illustrator and how it can be used to put finishing touches on maps created in GIS and to edit map layouts. I've only ever exported/printed maps directly from GIS, so this was a new experience. Figuring out how to use Illustrator has been challenging, but there's so much you can do with it!

Below is my finished map. The base map was created in ArcGIS from data provided, and I used Adobe Illustrator to adjust the colors (including the transparency of the water features, so that county boundaries are visible underneath), to change the city symbols using a find/replace script also provided for the class, to label some of the cities, and to edit the legend. I also resized the map (along with its scale bar, of course!) and finally added images, various text elements, and a background color to finalize the layout.

Monday, January 23, 2017

GIS Week 2: "Own Your Map" Lab

Below is the map that resulted from this week's lab, which focused on map design in ArcGIS. This lab was a much more thorough treatment of how to customize map elements and overall layout than I got in previous GIS courses. The actual assignment involved a map showing the location of UWF's main campus within Escambia County, FL, along with an inset map highlighting Escambia County's location within the state. (As an online student who's never set foot in Florida, I didn't even realize until now that UWF is almost all the way in the northwest corner of the state!) Nearby towns and interstates are also presented for additional context about UWF's location, which is indicated by the circled star to make it stand out.

Wednesday, January 18, 2017

Cartographic Skills Module 1: Map Critique

This assignment asked us to find and critique two maps, one well designed and one poorly designed, judged on the basis of Edward Tufte's list of map design principles detailing what makes a good and useful thematic map.

As my well-designed map, I chose this one from the U.S. Census Bureau:



This map does a good job of clearly and concisely presenting the largest ancestry group in each county of the U.S., both providing that information for individual counties and visualizing regional patterns and variation. While there would surely be even more variation at a finer geographic level (such as census tract), using the county-level data brings out patterns that are hidden by state-level data without making the map so packed with information that it’s hard to see what’s happening. It follows Map Principle 2, “complex ideas communicated with clarity, precision, and efficiency”; it’s clear at a glance what the map is showing, and there is no ambiguity or unnecessary information. It also follows Map Principles 7 and 8 about clear and thorough labeling, by providing a prominent legend identifying the significance of each color displayed as well as a list of what’s included in the “Other” category. A few aesthetic aspects of this map particularly appeal to me: its simplicity (there’s a lot of information presented because the map covers a large area, but only two kinds of information--county outlines and largest ancestry group), its visual appeal (the colors are readily distinguished from one another, but none are blindingly bright or clash with their surroundings), and the use of the ancestry-by-state inset map, which adds context and emphasizes the amount of variation at the county level.

As an example of a poorly designed map, I chose this one from mapsofworld.com:


The biggest problem with this map is that it supposedly shows countries according to language, but it doesn’t explain what this categorization signifies. Are they mapping the majority language? The official language? Something else? Since many countries have multiple languages spoken within their borders, it’s important to be specific about what this map illustrates. While the layout and overall design of the map aren’t terrible, it’s hard to get any significant information out of it, which is a big problem. These issues violate Map Principles 2 and 3, about clearly and efficiently mapping big or complex ideas, and possibly Principle 5, which “requires telling the truth about the data”; instead, the map leaves the viewer with questions or assumptions about what the data actually mean. I would suggest improving this map by providing a clearer explanation of what information is being presented (even a more specific title could accomplish this), by explaining or eliminating the inconsistencies in categorizing that information (e.g., what distinguishes countries labeled “multi-lingual” from those assigned an individual language despite having multiple languages spoken by their populations, and why are some countries simply labeled “other”?), and by being more careful with color choices to avoid confusion between very similar colors that mean different things.

Saturday, January 14, 2017

GIS Week 1: ArcGIS Overview Lab

The first GIS assignment in our introductory course was an easy one designed to familiarize the class with the basics of working with data in ArcGIS. This lab was basically a review for me, as I worked with ArcGIS as an undergrad, but it's nice to have a little refresher since I've been away from it for a few years, and nice to know that what I learned is coming back to me now that I've seen the program again! (I'm actually a little annoyed that we didn't go into changing the legend settings and breaking up the population data in a way that makes more sense, but I know that for anyone seeing ArcMap for the first time this week, this was probably already an overload of information.)
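(If we had gone into it, adjusting the class breaks could look something like this hypothetical arcpy sketch, run from ArcMap's Python window; the layer name, field name, and break values are all made up:)

    import arcpy

    # Re-break a graduated-colors population layer into more sensible
    # classes. Layer name, field name, and breaks are hypothetical.
    mxd = arcpy.mapping.MapDocument("CURRENT")
    lyr = arcpy.mapping.ListLayers(mxd, "Countries")[0]
    if lyr.symbologyType == "GRADUATED_COLORS":
        lyr.symbology.valueField = "POP2007"
        # First value is the class minimum, then one break per class
        lyr.symbology.classBreakValues = [0, 10e6, 50e6, 150e6, 500e6, 1.4e9]
    arcpy.RefreshActiveView()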

In any case, here is the map I produced following the lab instructions, which shows the countries of the world according to their population sizes--the darker the color, the higher the population (as of 2007, the year of the data we were using).