
North River Geographic Systems Inc

Spatial Problem Solving


GeoServer

Dockering with Docker

rjhale · Dec 1, 2020 ·

It’s been a while since I posted anything, mostly because I have no clue what to talk about these days. I’ve been working away on the TN Address Server, and one thing I wanted to do was get it working in Docker. Which – many of you are probably going IT’S NOT THAT HARD. Generally you are correct, but it’s still a weird area for me to jump into. I’ve toyed with Docker for years – in general it works, but I hardly ever have a reason to do much with it. Except now, with the way I’m working on the TN Address Server, I needed a way to “spin up” a PostGIS server, test everything, and then take it down.

So I jumped back into Docker. Again.

You can build your own Docker image: you create a Dockerfile and start adding whatever you need. I quickly decided I didn’t want to build my own Dockerfile…yet. After doing a bit of research I found Kartoza already has one built for PostGIS. They also have one for GeoServer. GeoServer is one of those things I don’t quite have integrated into the flow of the address server yet – but I’m working on it.

Sitting down with Docker I had four things I wanted to do:

  • Functional database I can start and stop
  • Local Storage
  • I can sit in another room, work, and still get to it on this machine.
  • I don’t want a pile of things installed on my computer.

Simple? Yes…mostly. It seems like for most of my career I’ve been dancing between developer, system administrator, and geo person. I never seem to have complete control over any of them, though being a “geo person” scratches the itch more than the other two. So that started a day-long “How do I…..?” which again reaffirmed that Stack Exchange is the devil, documentation can be boring, and a lot of people can’t write anything to save their lives (myself included about half the time).

The biggest headache was keeping my data persistent on my hard drive. I can mount a directory into the GeoServer container – so why can’t I mount one into the PostGIS container? Surprisingly, there is a wealth of misinformation out there. In short – you have to make a volume for PostGIS to work, which is basically the following:

  • Create a /directory/somewhere
  • Create a volume: docker volume create --driver local --name pg_data --opt type=none --opt device=/directory/somewhere --opt o=bind
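
Putting those two steps together, with a quick check that the bind actually took (the path is just a placeholder for wherever you want the data to live):

mkdir -p /directory/somewhere
docker volume create --driver local --name pg_data --opt type=none --opt device=/directory/somewhere --opt o=bind
docker volume inspect pg_data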

Finally I did the following:

docker run --name=postgis -d -e POSTGRES_USER=user -e POSTGRES_PASS=pass -e POSTGRES_DBNAME=tndemo -e ALLOW_IP_RANGE=0.0.0.0/0 -p 5432:5432 -v pg_data:/var/lib/postgresql/ kartoza/postgis:latest
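
A couple of quick sanity checks once that container is up, assuming the psql client is installed on the host and using the user/password/database values from the run command above:

docker logs postgis
psql -h localhost -p 5432 -U user -d tndemo -c "select version();"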

Then

docker run -d -p 8580:8080 --name "geoserver" --link postgis:postgis -p 8080:8080 -v /media/data/demo/geoserver_data:/opt/geoserver/data_dir kartoza/geoserver:latest
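
To confirm both containers are alive, something like the following works. Either mapped port (8080 or 8580) should answer, and swapping localhost for the machine's IP is what lets you get at it from another room:

docker ps
curl -I http://localhost:8080/geoserver/web/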

PostGIS and GeoServer up and running in harmony…in Docker. I was pleased. I can connect GeoServer to PostGIS. I can connect to both from my couch while I listen to the Baby Yoda Chronicles.

I was also a bit dismayed. Mostly at me.

Burning the amount of time I did getting to a functional setup was way too long. Not anyone’s fault, as I have directions, functioning internet, and the ability to ask questions in the community. Which has been a problem for me as of late – I start fresh on a project and get haunted by past failures. So much so that I’ve been “tuning out” more. The pandemic has made things frustrating enough without my brain getting in the way. So I’m back to the art of learning this month and tossing out the old.

Now that this is running I can build and break the database at will and hopefully get a version that does everything I want it to do.

Maybe I’ll build a dockerfile just to say I did it.

Maybe I’ll do a few other fun things.

TN NG-911 Server

rjhale · Jul 24, 2019 ·

Last year I had the chance to work with Henry County TN’s 911 group on a software migration project. I moved the organization off a basic commercial GIS product to an open source enterprise server. Overall it was a great project with a defined end goal and a lot of flexibility in making it happen.

Once the project ended I never stopped working on the database and the tools. Over the last few months I’ve had the chance to turn the database into something that is 100% compliant with the State of TN’s NG911 System. I present the TN NG911 Server.

What does it look like?

  • Fulcrum for Mobile
  • QGIS for a GIS Desktop
  • PostGIS/PostgreSQL for the database
  • Geoserver to provide OGC compliant services

The workflow is simple. Field Personnel go out and collect data using menus set up in Fulcrum.

Data is stored in PostGIS/PostgreSQL and edited with custom menus in QGIS. The database is served out through Geoserver as an OGC service.
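
As a rough illustration of what “served out as an OGC service” means in practice, a WFS request against GeoServer looks something like this (the workspace and layer names here are made up, not the actual ones in the TN NG911 Server):

curl "http://server:8080/geoserver/ows?service=WFS&version=2.0.0&request=GetFeature&typeNames=ng911:address_points&outputFormat=application/json&count=10"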

Data is exported from the server at night to the state’s ESRI database.

The bonuses to this approach:

  • 100% TN NG911 Compliant
  • Cloud Deployment Or Onsite Deployment
  • Local TN Support for your organization
  • Can Interface with Computer Aided Dispatch Systems through managed Exports.
  • Quality Checks on your attributes
  • Topology checks for your data
  • Unlimited Desktop Deployments for everyone
  • Can work with TN’s LIDAR Data and TN’s Imagery Services.
  • Handles more than just the 3 data layers the state needs for NG911

Granted, it’s not 100% open source, as Fulcrum is a commercial product. Fulcrum will need to be purchased, but it’s only a fraction of what you could be spending on software, and it has more uses than just address collection.

The TN NG911 Server has a solid deployment that’s been in production a year. If you’re interested in integrating Open Source GIS into your 911 system you should at least look at this approach. Everyone needs good clean data.

Future plans include a plugin and more quality checks on the data. For version 1, though, this is very solid and very functional.

What about the NENA Standard? The TN Standard is not NENA Compliant! Well – stay tuned for more news.

PostgreSQL Hack

rjhale · Feb 21, 2019 ·

Many of you will look and go “OMG – Randy has a new hack for a database”. No – Randy is a hack at databases.

So about 4 or 5 years ago I was dropped into the world of databases. I was working a job several thousand miles to the south and life was easier with a database as opposed to flat files. Up until that point I had flirted with databases and taken some online courses. I would make small databases, but it wasn’t something I had to deal with day to day. I did deal with it indirectly – because at the time my employer was running ArcSDE on top of Oracle – BUT – as a lowly employee I wasn’t allowed to do much with it. That was the job of the admins. I made ESRI file-based geodatabases and was quite happy. Every now and then I’d even make Microsoft Access-based ones.

So flash forward to this last year and I’ve been up to my neck in databases. A lot of you will shrug. You do amazing things with PostgreSQL and SQL Server and it’s not that big of a deal. Last year I did my third big migration of flat geo files into PostgreSQL/PostGIS. It was a fairly simple thing to do: QGIS, PostGIS, and I loaded GeoServer (but we still haven’t used it – I’m confident we will).

The client moved from an “ArcView/file-based geodatabase” environment to a multiuser database environment. They are a 911 office in a small county. They went from 1 person being able to work to 3 people working and viewing the data. I think they currently have 4 people adding and updating data. That’s a win.

My god the dots

What wasn’t a win was my moving of data into the database. File-based geodatabases don’t have a primary key. Maybe there is some way to deal with that these days – I don’t know. So they were using an extension called Attribute Assistant to deal with unique columns and sequences in ArcMap. I loaded the data and didn’t pay enough attention to what was happening.

To make a short story longer – I mucked up the unique column the data had. I didn’t know it had to be assigned once and then never change – it was explained to me as “it just needs to be unique”. When I loaded it, the primary key and the unique ID were off. Primary key was 4 and ID was 5. Look, there is a primary key of 10 and an ID of 247. The unique ID had nothing to do with the county data – but everything to do with the state 911 database. So I calc’d the ID off the primary key and that’s when things went stupid. So what magical GIS thing did I do? I didn’t.

I made a new index column.

update hc911.addresspoints set index = right(oirid, -6)::int;

Which ripped the number off the unique column and assigned it to a new field. I built a new sequence with a new incrementing value. I messed that up. I ended up defining a new primary key. Fixed. No GIS knowledge applied – just databases.
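
For the curious, rebuilding the sequence and primary key looks roughly like this in PostgreSQL. This is a sketch against the table from the statement above; the sequence name is an assumption, not what actually ran:

create sequence hc911.addresspoints_index_seq owned by hc911.addresspoints.index;
select setval('hc911.addresspoints_index_seq', (select max(index) from hc911.addresspoints));
alter table hc911.addresspoints alter column index set default nextval('hc911.addresspoints_index_seq');
-- the old primary key constraint has to be dropped first; its name depends on how the table was built
alter table hc911.addresspoints add primary key (index);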

How do you know you are tired? This is how!

Which – not a huge deal, but a learning experience for me. I need to spend one day talking to the client and mapping out a path before we migrate. Migrating the data took 2 hours. Capturing the institutional knowledge probably would have taken 8 hours. The next one won’t have this problem. You’re going to have to talk to me a lot about your data and process. Way more than you want, and that’s fine – we just need to talk.

Data is everything. To make everyone happy I set up a few cron jobs to dump data into GeoPackage and shapefiles using OGR. They’ve been quietly chugging along for 6 months now, doing their work with no applause or fanfare. More people in the office are slowly going “hey, you mean I can use QGIS to connect to the database? I can see the data?”.
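
One of those cron entries probably looks something like this. This is a sketch, not the real job: the schedule, paths, database name, and credentials are all placeholders:

# nightly: dump the address points to a GeoPackage and a shapefile
0 2 * * * ogr2ogr -f GPKG /exports/addresspoints.gpkg PG:"host=localhost dbname=hc911 user=gis password=secret" hc911.addresspoints -overwrite
15 2 * * * ogr2ogr -f "ESRI Shapefile" /exports/addresspoints PG:"host=localhost dbname=hc911 user=gis password=secret" hc911.addresspoints -overwrite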

I can say this whole move minus the ID hiccup has been a huge win for the organization. I was just lucky enough to be the guy to do it.

So if you’re an organization struggling with your commercial system – why not take a look at this side of the fence. We’re a lot of fun. If you’re wondering “Hey what should I learn these days” – I invite you to dive into PostgreSQL/PostGIS.


OSGEOLive 12.0

rjhale · Sep 6, 2018 ·

So for those of you going “HEY, I WANT TO USE SOME OF THIS OPEN SOURCE SOFTWARE”: maybe you can’t install it locally…or maybe you have a spare machine lying around – there’s a way for you to do it.

Download the OSGeoLive image and run it as a virtual machine, or boot your computer into the distribution. It’s a Linux operating system with 50-plus free and open source geospatial applications.

They put a ton of effort into releasing this as a showcase of the software in the OSGeo realm. So give it a run and test out the software you have questions about.

Boundless Suite 4.9.1 Virtual Machine

rjhale · Nov 12, 2017 ·

It’s been a while since I’ve dabbled with anything Boundless. So I had some time on my hands and decided to take a look at the newest community suite. I had intended to write up a review full of acronyms and technical babble. I became a bit sidetracked, so this is a softer review of the Suite and more of a “there’s a lot of stuff up there” post.

Anyway – for those of you who have no idea what a Boundless is: they are a commercial firm that provides support for open source products like PostGIS, GeoServer, etc. They have built those components into a product called “Boundless Suite” (or Boundless Server). They have other offerings they’ve developed to help folks with geospatial data (like Boundless Connect). You can purchase support for the Boundless Suite. If you don’t, you can get the last community release for free. I downloaded 4.9.1 and my only complaint is that it’s a bit dated. It was released in Dec 2016 if I’ve read everything correctly….BUT – I’m using it as we speak and loading data into it, so it’s usable.

The nice thing with the Suite is you get a feel for how things work. I get a lot of questions when I’m out and about along the lines of “What is an open source GIS server?”. The ESRI folks are a bit confused, as for them that’s a product you buy. Here it’s something you put together to fit your needs. In this case the Suite is:

  • Geoserver
  • PostGIS
  • GeoWebCache
  • GeoComposer
  • WPSBuilder

I won’t get into all the technical details of what I did, but in short:

  1. Download the ISO file from Boundless after creating a login into Boundless Connect
  2. Import it into VirtualBox
  3. Start the Dashboard

I’ve been exploring. The GeoComposer feature is something new and nice to play with. The Web SDK isn’t functional as downloaded, but I’ve been digging and I think I can download the needed components. Which really highlights the value of the documentation.

If you’ve been curious as to how all these components work together, now is your chance to download the suite and READ THE DIRECTIONS. Which has pretty much derailed my technobabble post. I get bogged down enough in things that I sometimes miss software happenings outside of PostGIS/QGIS. All the components have also made me dig back into web mapping. I know – not that big of a deal, but for a one-man shop I have to learn when I’m not doing everything else.

So to wrap up the blog post I’ll ask myself a question. Would I run it? Yes and no. It works well. I’ve got a server capable of storing data and running services, and it’s functional within about 15 minutes. I hooked QGIS (2.18.14) to it and loaded data. I can also download the components and, after learning how all this works together, start building my own server setup with more current components. Of course there’s always the support question – you can run a more current setup by paying Boundless for support. I have no idea what that costs.

Anyway – I’m going to continue messing around with the VM and reading more of the documentation.  Learn some new software!

In between writing this – I loaded data into a map running from the VM


