
North River Geographic Systems Inc

Spatial Problem Solving


Python

Modeling Part 2

rjhale · Sep 22, 2020 ·

Not being content with my last model….

I dove back in because I really need to name the points – the forestry guys almost always use Garmin GPS equipment. After sitting on this for a week the question became "How do I name everything?" and get the model to the point where it handles 90% of it:

  1. Name the point – and since I'm exporting to GPX, the field has to be "name"
  2. Perform some sort of field calculation to get a name like "pt1"
  3. Label it in QGIS because I really want to be lazy.

I found the rest of the tools to let me do this:

  1. Add autoincremental field, which gave me a number from 1 through X
  2. Field calculator, which let me name the point 'pt' || "AUTO"
  3. Set layer style
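For anyone who would rather poke at this from the QGIS Python console, here's a rough sketch of the first two naming steps. The algorithm IDs and parameter names are from memory and shift a little between QGIS releases (processing.algorithmHelp() will confirm them), and the layer names are made up:

import processing

# 1. Add an auto-incrementing field called AUTO (1 through X)
numbered = processing.run("native:addautoincrementalfield", {
    "INPUT": "cruise_points.gpkg",     # hypothetical point layer from the model
    "FIELD_NAME": "AUTO",
    "START": 1,
    "OUTPUT": "memory:",
})["OUTPUT"]

# 2. Field calculator: build the GPX-friendly "name" field as 'pt' || "AUTO"
named = processing.run("qgis:fieldcalculator", {
    "INPUT": numbered,
    "FIELD_NAME": "name",
    "FIELD_TYPE": 2,                   # 2 = string
    "FIELD_LENGTH": 20,
    "FIELD_PRECISION": 0,
    "FORMULA": "'pt' || \"AUTO\"",
    "OUTPUT": "named_points.gpkg",     # hypothetical output path
})["OUTPUT"]

The third tool, Set layer style, just points the result at a saved .qml style file, so I left it out of the sketch.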

My Model now looks like:

Not to go over the additions in excruciating detail, but three tools later I have a functional point layer. My next question is whether I can slap this on GitHub and share it – which I'm going to do shortly. Will that break the layer style? I'm not sure – but I can save this with a ton of notes for more development.

I've really ignored the functionality of the Processing Toolbox, and this small dive into it has highlighted some sore spots I've had with the TN 911 address project. I get frustrated because I'm not the best at Python. The big problem has been working through a few issues that are going to require some sort of interaction between the 911 addressing person and the database, and I can't do that part in SQL (I could – but it would create too much black magic). I can fix it with a model, though.

Then all my Python fears ease a little, because all of this can be exported to Python and I can pick at it.

My 10 to 15 minutes of point building are now down to 10 seconds.

Now to get them using something other than a Garmin GPS unit – like their phones.

 

Modeling in QGIS

rjhale · Sep 9, 2020 ·

I should model more….but I don’t. You’re welcome.

Anyway – the last few weeks I've been boring you with tales of topology, and now I'm going to bore you with tales of building a model. One of my longest-running jobs has been in forestry, and that's pretty much a set of repeatable processes. With the current lockdown I've been experimenting with client data and how to speed up some of what I've been doing.

QGIS has a modeler. In ArcGIS, ModelBuilder was probably my favorite tool because I'm always doing repeatable processes. Granted, I've offloaded a ton of things to PostGIS, BUT – wouldn't it be cool if…..

The forestry guys get a new property to look at, and I usually do something clumsy to get it into QGIS. If it's an inventory they go and walk it. They usually walk it at a certain interval, so they call me and I:

  • Create a grid using the Regular points processing tool.
  • Manually delete the points that fall too far outside the boundary
  • Name the points

This takes anywhere from 5 to 15 minutes depending on the property and how well we feel the boundary fits the ground. A simple model would look like this:

With the QGIS Processing Modeler you have to set up an input before you can do anything, so my input will be a vector layer. I drag the Regular points algorithm in and give it an output. When I double-click Regular points it needs an extent to draw the points, so I set that to a model input. I give it a name, and you can see how everything is starting to connect. I also give it some spacing, which is 264 feet (4 chains):
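If you would rather see that first pass as code, here's a minimal sketch from the QGIS Python console. The algorithm ID and parameter names are from memory and may vary by QGIS version (processing.algorithmHelp("qgis:regularpoints") shows the real ones), and the layer name is hypothetical:

import processing
from qgis.core import QgsProject

boundary = QgsProject.instance().mapLayersByName("property_boundary")[0]

grid = processing.run("qgis:regularpoints", {
    "EXTENT": boundary.extent(),   # the extent the model takes as an input
    "SPACING": 264,                # 264 ft = 4 chains (layer units are feet)
    "INSET": 0,
    "RANDOMIZE": False,
    "IS_SPACING": True,
    "CRS": boundary.crs(),
    "OUTPUT": "memory:",
})["OUTPUT"]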

Not bad but not great:

but I need to clip it to the boundary……and, assuming the boundary is wrong, I probably need to include some points outside the boundary. I also don't want to open the model every time to change the spacing. Soooooooo…..let's get to the "Poof" magic part:

What did I do?

  • Buffer the input property
  • Create a grid based on this buffered polygon
  • Extract the points from the buffered area
  • Create a new layer
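Here's the same chain as a hedged sketch from the Python console rather than the modeler – the algorithm IDs and parameters are from memory and the layer/file names are made up, so treat it as an outline rather than the model itself:

import processing
from qgis.core import QgsProject

boundary = QgsProject.instance().mapLayersByName("property_boundary")[0]

# 1. Buffer the input property so the grid runs a little past a shaky boundary
buffered = processing.run("native:buffer", {
    "INPUT": boundary,
    "DISTANCE": 264,            # one grid cell past the line
    "SEGMENTS": 5,
    "DISSOLVE": True,
    "OUTPUT": "memory:",
})["OUTPUT"]

# 2. Regular grid of points over the buffered polygon at 264 ft (4 chain) spacing
grid = processing.run("qgis:regularpoints", {
    "EXTENT": buffered.extent(),
    "SPACING": 264,
    "INSET": 0,
    "RANDOMIZE": False,
    "IS_SPACING": True,
    "CRS": boundary.crs(),
    "OUTPUT": "memory:",
})["OUTPUT"]

# 3. Keep only the points that fall inside the buffered area
points = processing.run("native:extractbylocation", {
    "INPUT": grid,
    "PREDICATE": [0],           # 0 = intersects
    "INTERSECT": buffered,
    "OUTPUT": "cruise_points.gpkg",   # 4. the new layer
})["OUTPUT"]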

That's almost perfect. So 5 minutes down to this. Of course there are things I need to change, like "what if they want an irregular cruise" that might be a 4×3 or a 5×3. I haven't quite figured out numbering the points, but I'm assuming I'll stumble into it shortly.

I haven't used the modeler in quite a while, but with some extra time I'm going to shrink some of my processes to "as quick as possible". I'll probably play around with Python some, but I don't really have to – this works for what I need to do, with some small adjustments.

QGIS Processing: History

rjhale · Jul 30, 2020 ·

If you're anything like me (hopefully you aren't), you'll do something and immediately wish you hadn't closed the processing window. Clip, simplify, select by location……

So in this case I had a call about contouring a Digital Elevation Model. Contours are easy. I ran them at 10…or maybe 20.

QGIS and DEM

I decided to clip them with the county boundary since the DEM extends outside of the county into the neighboring counties and states.

QGIS and Clipped DEM

So the question is now "WHAT DID YOU DO?" You get up. You come back. The cat jumps in your lap. Your mom calls. QGIS keeps a history of your processing commands. If you look at the Processing Toolbox you'll see a small clock.

Click it and another window opens.

Here is the cool thing: click any of the commands and you'll be taken back to the original QGIS Processing dialog. So if I ran contours and don't like them at 20-foot intervals? Go back to the history, double-click, change the interval to 10, and run it again. You like Python? You have what you need to know, and you can run it from the Python console. I made one small change and dumped it to a shapefile this time instead of a memory layer:
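What the history hands you is basically a processing.run() call you can paste into the Python console and tweak. Here's a rough sketch of what mine looked like – the parameter names are from memory and the paths are hypothetical, so check your own history entry for the real values:

import processing

processing.run("gdal:contour", {
    "INPUT": "/data/county_dem.tif",        # hypothetical DEM path
    "BAND": 1,
    "INTERVAL": 10,                         # changed from 20 after checking the history
    "FIELD_NAME": "ELEV",
    "OUTPUT": "/data/contours_10ft.shp",    # a shapefile this time instead of a memory layer
})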

So if you do forget a tool setting after you've closed the dialog – just look it up in the Processing Toolbox history and run it again.

 

The Second Day of XYMas….GDAL

rjhale · Dec 16, 2019 ·

So going back to the college kids and the "OMG I gotta learn Python". It actually helps if you do learn Python. It will help your career, and it's always nice to know how to program in something….

BUT

I will submit, on this the second day of XYMas, that you should learn GDAL before you dive into Python if your interests are centered around location. GDAL is the underpinning of a lot of GIS software (ArcGIS and QGIS, to name two). It's command-line based (see the previous post) and it's all-powerful: you can manipulate vector and raster data, and there are a ton of resources for using it with Python out on the internet. No license required. All you need is a strong heart and nimble fingers. Remember – GDAL tools work on rasters and OGR tools work on vectors. That's lesson 1.

As my small working example of why this knowledge will serve you well:

  1. Let's download Significant Earthquakes from the USGS as a CSV file: https://earthquake.usgs.gov/earthquakes/feed/v1.0/csv.php
  2. I'm going to use ogr2ogr to move it into something other than CSV: ogr2ogr -f GPKG equakes.gpkg significant_month.csv -oo X_POSSIBLE_NAMES=longitude -oo Y_POSSIBLE_NAMES=latitude -a_srs 'EPSG:4326'
  3. I'm now going to query all the earthquakes greater than 5.0 from GDAL: ogrinfo equakes.gpkg -dialect sqlite -sql "select mag from significant_month where mag > 5.0 order by mag desc"
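And since I keep saying "it works with Python" – here's the same query through GDAL's own Python bindings. A minimal sketch: the file name matches the ogr2ogr step above, everything else is an assumption on my part:

from osgeo import ogr

ds = ogr.Open("equakes.gpkg")
# Run the same SQLite-dialect query that ogrinfo ran above
layer = ds.ExecuteSQL(
    "select mag from significant_month where mag > 5.0 order by mag desc",
    dialect="sqlite",
)
for feature in layer:
    print(feature.GetField("mag"))
ds.ReleaseResultSet(layer)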

For fun I’m going to drop it into QGIS just to see everything (this was pre-New Zealand Volcanic Eruption):

QGIS 3.10

There are a few things I could do to make this better – like making sure the magnitude is a number field and not a string but for the example it works.

Last year a client suddenly surprised me with a "Hey – we gotta deliver some data to the state and it has to be in an ESRI-acceptable format". Which – no big deal, except I had spent a good deal of time moving them to PostGIS to get away from ESRI. We set up a script and it's been running for 15 months or so. We've upgraded software – we've changed servers. With a few modifications it's still running.

ogr2ogr -f GPKG TCStransport.gpkg -nlt point PG:"host=host user=user dbname=dbname password=pass" -nln "addresspoints" -sql "select id, oirid, geom, r_segid, a_segid, seg_side, gislink, cast(structype as SMALLINT), strucdesc, stnum_h, stnum_l, stnum, stnumsuf, building, floor, unit_type, unit_num, secuntnum, predir, pretype, name, type, sufdir, postmod, address, addr_esn, label, subname, vanity, zip, zip4, esn, city, county, state, cast(source as SMALLINT), lon, lat, x_sp, y_sp, z_val, gpsdate, addrauth, editor, geomod, geosrce, geodate, attmod, attsrce, attdate, cast(status as SMALLINT), delnotes, right_left, comment, readdress, old_address, color, point_type, street, cad_name, loc_note, cad_post from addresspoints"

Think about that – a small program that I (2nd worst programmer in the known universe) wrote that has now run through 1 server upgrade and 3 software upgrades without breaking. Every night for over a year. I should have set some sort of $3.50 a day payment plan based on success.

….and I've not even discussed the work you can do with raster data. Actually – let's discuss the GDAL part. I had uncompressed a SID image and needed it for a client – so I ran this to compress it back to something normal-sized:

gdal_translate -co TILED=YES -co COMPRESS=JPEG -co PHOTOMETRIC=YCBCR -tr 1 1 -mask 3 ortho_1-1_hc_s_tn065_2018_1.tif NAIP_2018.tif

Could I have added more “things” to make it better/smaller? Probably – but it worked. So I can take any sort of image file and un-compress, compress, change the format, reproject, etc……All from GDAL. It works with everything including ArcMap/Pro/Whatever it’s called.

NO LICENSE REQUIRED. IT WORKS WITH PYTHON FOR YOU PEOPLE WHO WANT TO LOVE PYTHON.

For this second day of XYMAS I point you to:

  • An excellent tutorial by Sara Safavi: http://www.sarasafavi.com/intro-to-ogr-part-i-exploring-data.html
  • The GDAL Documentation: https://gdal.org/
  • The Python Cookbook: https://pcjericks.github.io/py-gdalogr-cookbook/

So I do ask you to learn GDAL and then move into Python. It's great software with an active community. You will not waste your time learning this skill – if you're going to stick around the geo world you're going to touch it at some point (or you should – and if you don't, I have questions about why).

 

I didn’t know it was called Refactor Fields in QGIS

rjhale · Mar 2, 2018 ·

The title may be the worst part of this post. Hopefully. Anyway – here's what happened. I will explain it in all of its goofy detail with some hope that by the end you'll do the same thing I did.

I've a client that is currently making me push what I can do with QGIS/PostGIS. We hit a technical bump with data collection, and to fix it they walked in with a new app they had purchased from some online app store that records data from their super-accurate GPS unit. The manufacturer of the GPS unit had its own app – but they found this one scratched the itch better than the one they could have used….and the one they were using. My eye didn't twitch much. I just cautioned against randomly changing things in your setup. I mean, what if it makes a file you can't import into your database? What if………

They have a table in PostgreSQL that is more or less this: column1, column2, column3. This app could create several data formats that ultimately left me with: Foo1, column2, Bar3. You can't just "copy and paste" the data in because the columns need to match, so I need to force them to match. No matter what trick I tried it wasn't easy, BUT some of them did work.

An example of my thought process. My Python skills aren't great – BUT – using the psycopg2 library, some ogr, and "the google" I was able to connect, change my data, and push it into PostgreSQL. More or less you parse out the columns and match them to the PostgreSQL table:

import psycopg2
import osgeo.ogr

# Open the shapefile and grab its single layer
shapefile = osgeo.ogr.Open("/home/rjhale/gis/shapefiles/monument.shp")
layer = shapefile.GetLayer(0)

conn = psycopg2.connect("dbname='database' user='me' "
                        "host='localhost' password='itsasecret'")
cur = conn.cursor()

for i in range(layer.GetFeatureCount()):
    feature = layer.GetFeature(i)
    descriptio = feature.GetField("descriptio")
    pos_x_sf = feature.GetField("Pos_x_sf")
    pos_y_sf = feature.GetField("Pos_y_sf")
    elev_z_sf = feature.GetField("Elev_z_sf")
    photos = feature.GetField("photos")
    local_date = feature.GetField("local_date")
    pdop = feature.GetField("pdop")
    # Get the feature geometry and convert it to WKT
    geom = feature.GetGeometryRef()
    wkt = geom.ExportToWkt()
    # Insert into the database, converting the WKT geometry to PostGIS geometry.
    # Values go in as psycopg2 parameters rather than being pasted into the SQL.
    cur.execute("INSERT INTO test.monument (descriptio, pos_x_sf, pos_y_sf, "
                "elev_z_sf, photos, local_date, pdop, geom) "
                "VALUES (%s, %s, %s, %s, %s, %s, %s, "
                "ST_GeomFromText(%s, 6447))",
                (descriptio, pos_x_sf, pos_y_sf, elev_z_sf, photos,
                 local_date, pdop, wkt))

# Commit so the inserts actually stick, then clean up
conn.commit()
cur.close()
conn.close()

 

Holy crap. It worked. I really need to up my Python game, BUT – no one wanted to run something from the command line. MAKE A PLUGIN! Which I considered. In my digging I found the Refactor fields tool in QGIS. It's very similar to loading data into a database with ArcGIS, for those of you stuck on that side of the fence.

Refactor Fields

So how does it work? It’s easy. Load your data that doesn’t quite match anything as the input layer. Load the table you want it to match in the “load fields from layer” part at the bottom. Click Load Fields.

loaded refactor fields

So what just happened? Refactor fields took my shapefile and matched it to the columns in PostgreSQL. If it can't "guess", you have the option to make it match by typing in the fields. It generates a new layer that I can "copy and paste" into the database, or import into it. For some of the data I'm having to clean out the fields – things in the shapefile that don't match I'm removing from the new layer with this processing tool. Which – maybe I need to add those fields to PostgreSQL, but I really want this table stabilized before we start. Although I've not tried it, I'm guessing this works with any layer you want – I'm just using it against PostgreSQL. I'll also say this probably won't match projections – so get your projection worries figured out beforehand; this just handles the attributes.
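If the model route pans out, Refactor fields is also available as a processing algorithm you can drive from Python. This is only a sketch – the algorithm ID, the mapping keys, and the layer names are all assumptions from memory, so lean on processing.algorithmHelp("qgis:refactorfields") for the exact form in your QGIS version:

import processing
from qgis.core import QgsProject

source = QgsProject.instance().mapLayersByName("monument")[0]       # the app's shapefile
target = QgsProject.instance().mapLayersByName("test.monument")[0]  # the PostGIS table

# One mapping entry per target column; the expression assumes the source field
# has the same name (edit the ones that don't, e.g. Foo1 -> column1).
mapping = [{
    "expression": '"{}"'.format(f.name()),
    "name": f.name(),
    "type": f.type(),
    "length": f.length(),
    "precision": f.precision(),
} for f in target.fields()]

refactored = processing.run("qgis:refactorfields", {
    "INPUT": source,
    "FIELDS_MAPPING": mapping,
    "OUTPUT": "memory:",
})["OUTPUT"]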

Right now I'm looking at a model so they can ultimately just "CLICK SOMETHING". We all like menus. Mostly.

Anyway – load your data. QGIS has many tricks to make your life easy, and I had no clue. I did learn something on the Python side of life, though, and for that I'm happy even though it turned into a dead-end.


